[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
13531 1726882411.44794: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-Xyq
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
13531 1726882411.45213: Added group all to inventory
13531 1726882411.45215: Added group ungrouped to inventory
13531 1726882411.45219: Group all now contains ungrouped
13531 1726882411.45222: Examining possible inventory source: /tmp/network-91m/inventory.yml
13531 1726882411.66568: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
13531 1726882411.66635: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
13531 1726882411.66662: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
13531 1726882411.66728: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
13531 1726882411.66808: Loaded config def from plugin (inventory/script)
13531 1726882411.66810: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
13531 1726882411.66852: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
13531 1726882411.66946: Loaded config def from plugin (inventory/yaml)
13531 1726882411.66948: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
13531 1726882411.67041: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
13531 1726882411.67520: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
13531 1726882411.67523: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
13531 1726882411.67527: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
13531 1726882411.67533: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
13531 1726882411.67538: Loading data from /tmp/network-91m/inventory.yml
13531 1726882411.67655: /tmp/network-91m/inventory.yml was not parsable by auto
13531 1726882411.68065: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
13531 1726882411.68103: Loading data from /tmp/network-91m/inventory.yml
13531 1726882411.68188: group all already in inventory
13531 1726882411.68195: set inventory_file for managed_node1
13531 1726882411.68199: set inventory_dir for managed_node1
13531 1726882411.68199: Added host managed_node1 to inventory
13531 1726882411.68202: Added host managed_node1 to group all
13531 1726882411.68202: set ansible_host for managed_node1
13531 1726882411.68203: set ansible_ssh_extra_args for managed_node1
13531 1726882411.68206: set inventory_file for managed_node2
13531 1726882411.68209: set inventory_dir for managed_node2
13531 1726882411.68209: Added host managed_node2 to inventory
13531 1726882411.68211: Added host managed_node2 to group all
13531 1726882411.68212: set ansible_host for managed_node2
13531 1726882411.68212: set ansible_ssh_extra_args for managed_node2
13531 1726882411.68215: set inventory_file for managed_node3
13531 1726882411.68217: set inventory_dir for managed_node3
13531 1726882411.68218: Added host managed_node3 to inventory
13531 1726882411.68219: Added host managed_node3 to group all
13531 1726882411.68219: set ansible_host for managed_node3
13531 1726882411.68220: set ansible_ssh_extra_args for managed_node3
13531 1726882411.68223: Reconcile groups and hosts in inventory.
13531 1726882411.68226: Group ungrouped now contains managed_node1
13531 1726882411.68228: Group ungrouped now contains managed_node2
13531 1726882411.68229: Group ungrouped now contains managed_node3
13531 1726882411.68315: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
13531 1726882411.68475: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
13531 1726882411.68524: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
13531 1726882411.68551: Loaded config def from plugin (vars/host_group_vars)
13531 1726882411.68553: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
13531 1726882411.68560: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
13531 1726882411.68574: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
13531 1726882411.68618: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
13531 1726882411.68951: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13531 1726882411.69046: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
13531 1726882411.69088: Loaded config def from plugin (connection/local)
13531 1726882411.69091: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
13531 1726882411.69726: Loaded config def from plugin (connection/paramiko_ssh)
13531 1726882411.69729: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
13531 1726882411.71676: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
13531 1726882411.71714: Loaded config def from plugin (connection/psrp)
13531 1726882411.71717: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
13531 1726882411.72484: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
13531 1726882411.72529: Loaded config def from plugin (connection/ssh)
13531 1726882411.72532: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
13531 1726882411.74625: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
13531 1726882411.74668: Loaded config def from plugin (connection/winrm)
13531 1726882411.74671: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
13531 1726882411.74707: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
13531 1726882411.74773: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
13531 1726882411.74846: Loaded config def from plugin (shell/cmd)
13531 1726882411.74848: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
13531 1726882411.74879: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
13531 1726882411.74953: Loaded config def from plugin (shell/powershell)
13531 1726882411.74955: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
13531 1726882411.75014: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
13531 1726882411.75433: Loaded config def from plugin (shell/sh)
13531 1726882411.75435: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
13531 1726882411.75479: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
13531 1726882411.75905: Loaded config def from plugin (become/runas)
13531 1726882411.75908: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
13531 1726882411.76206: Loaded config def from plugin (become/su)
13531 1726882411.76208: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
13531 1726882411.76604: Loaded config def from plugin (become/sudo)
13531 1726882411.76606: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
13531 1726882411.76639: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_nm.yml
13531 1726882411.77459: in VariableManager get_vars()
13531 1726882411.77487: done with get_vars()
13531 1726882411.77632: trying /usr/local/lib/python3.12/site-packages/ansible/modules
13531 1726882411.80980: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
13531 1726882411.81142: in VariableManager get_vars()
13531 1726882411.81148: done with get_vars()
13531 1726882411.81151: variable 'playbook_dir' from source: magic vars
13531 1726882411.81152: variable 'ansible_playbook_python' from source: magic vars
13531 1726882411.81152: variable 'ansible_config_file' from source: magic vars
13531 1726882411.81153: variable 'groups' from source: magic vars
13531 1726882411.81154: variable 'omit' from source: magic vars
13531 1726882411.81155: variable 'ansible_version' from source: magic vars
13531 1726882411.81155: variable 'ansible_check_mode' from source: magic vars
13531 1726882411.81156: variable 'ansible_diff_mode' from source: magic vars
13531 1726882411.81157: variable 'ansible_forks' from source: magic vars
13531 1726882411.81158: variable 'ansible_inventory_sources' from source: magic vars
13531 1726882411.81158: variable 'ansible_skip_tags' from source: magic vars
13531 1726882411.81159: variable 'ansible_limit' from source: magic vars
13531 1726882411.81160: variable 'ansible_run_tags' from source: magic vars
13531 1726882411.81160: variable 'ansible_verbosity' from source: magic vars
13531 1726882411.81199: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml
13531 1726882411.82429: in VariableManager get_vars()
13531 1726882411.82447: done with get_vars()
13531 1726882411.82462: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml
statically imported: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml
13531 1726882411.83516: in VariableManager get_vars()
13531 1726882411.83531: done with get_vars()
13531 1726882411.83545: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
statically imported: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
13531 1726882411.83657: in VariableManager get_vars()
13531 1726882411.83674: done with get_vars()
13531 1726882411.83810: in VariableManager get_vars()
13531 1726882411.83823: done with get_vars()
13531 1726882411.83831: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
statically imported: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
13531 1726882411.83907: in VariableManager get_vars()
13531 1726882411.83922: done with get_vars()
13531 1726882411.84218: in VariableManager get_vars()
13531 1726882411.84232: done with get_vars()
13531 1726882411.84236: variable 'omit' from source: magic vars
13531 1726882411.84254: variable 'omit' from source: magic vars
13531 1726882411.84290: in VariableManager get_vars()
13531 1726882411.84304: done with get_vars()
13531 1726882411.84350: in VariableManager get_vars()
13531 1726882411.84365: done with get_vars()
13531 1726882411.84401: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
13531 1726882411.84632: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
13531 1726882411.84769: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
13531 1726882411.85511: in VariableManager get_vars()
13531 1726882411.85531: done with get_vars()
13531 1726882411.85923: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
13531 1726882411.86060: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
13531 1726882411.87770: in VariableManager get_vars()
13531 1726882411.87796: done with get_vars()
13531 1726882411.87806: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
statically imported: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
13531 1726882411.89661: in VariableManager get_vars()
13531 1726882411.89682: done with get_vars()
13531 1726882411.89816: in VariableManager get_vars()
13531 1726882411.89834: done with get_vars()
13531 1726882411.90118: in VariableManager get_vars()
13531 1726882411.90134: done with get_vars()
13531 1726882411.90139: variable 'omit' from source: magic vars
13531 1726882411.90149: variable 'omit' from source: magic vars
13531 1726882411.90319: variable 'controller_profile' from source: play vars
13531 1726882411.90365: in VariableManager get_vars()
13531 1726882411.90379: done with get_vars()
13531 1726882411.90404: in VariableManager get_vars()
13531 1726882411.90420: done with get_vars()
13531 1726882411.90450: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
13531 1726882411.90585: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
13531 1726882411.90672: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
13531 1726882411.91089: in VariableManager get_vars()
13531 1726882411.91110: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
13531 1726882411.93714: in VariableManager get_vars()
13531 1726882411.93738: done with get_vars()
13531 1726882411.93744: variable 'omit' from source: magic vars
13531 1726882411.93755: variable 'omit' from source: magic vars
13531 1726882411.93794: in VariableManager get_vars()
13531 1726882411.93812: done with get_vars()
13531 1726882411.93838: in VariableManager get_vars()
13531 1726882411.93856: done with get_vars()
13531 1726882411.93894: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
13531 1726882411.94030: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
13531 1726882411.94112: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
13531 1726882411.94524: in VariableManager get_vars()
13531 1726882411.94553: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
13531 1726882411.96766: in VariableManager get_vars()
13531 1726882411.96797: done with get_vars()
13531 1726882411.96802: variable 'omit' from source: magic vars
13531 1726882411.96815: variable 'omit' from source: magic vars
13531 1726882411.96849: in VariableManager get_vars()
13531 1726882411.96870: done with get_vars()
13531 1726882411.96890: in VariableManager get_vars()
13531 1726882411.96916: done with get_vars()
13531 1726882411.96945: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
13531 1726882411.97059: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
13531 1726882411.97142: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
13531 1726882411.97560: in VariableManager get_vars()
13531 1726882411.97588: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
13531 1726882412.00052: in VariableManager get_vars()
13531 1726882412.00089: done with get_vars()
13531 1726882412.00095: variable 'omit' from source: magic vars
13531 1726882412.00122: variable 'omit' from source: magic vars
13531 1726882412.00162: in VariableManager get_vars()
13531 1726882412.00192: done with get_vars()
13531 1726882412.00214: in VariableManager get_vars()
13531 1726882412.00255: done with get_vars()
13531 1726882412.00288: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
13531 1726882412.00415: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
13531 1726882412.00494: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
13531 1726882412.00919: in VariableManager get_vars()
13531 1726882412.00953: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
13531 1726882412.05884: in VariableManager get_vars()
13531 1726882412.05917: done with get_vars()
13531 1726882412.05928: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml
statically imported: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml
13531 1726882412.07135: in VariableManager get_vars()
13531 1726882412.07168: done with get_vars()
13531 1726882412.07346: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
13531 1726882412.07360: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
13531 1726882412.08302: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
13531 1726882412.08447: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
13531 1726882412.08449: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-Xyq/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__)
13531 1726882412.08482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
13531 1726882412.08508: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
13531 1726882412.08677: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
13531 1726882412.08739: Loaded config def from plugin (callback/default)
13531 1726882412.08741: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
13531 1726882412.10211: Loaded config def from plugin (callback/junit)
13531 1726882412.10213: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
13531 1726882412.10266: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
13531 1726882412.10337: Loaded config def from plugin (callback/minimal)
13531 1726882412.10339: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
13531 1726882412.10379: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
13531 1726882412.10433: Loaded config def from plugin (callback/tree)
13531 1726882412.10440: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
13531 1726882412.10539: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
13531 1726882412.10541: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-Xyq/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_bond_removal_nm.yml ********************************************
2 plays in /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_nm.yml
13531 1726882412.10574: in VariableManager get_vars()
13531 1726882412.10589: done with get_vars()
13531 1726882412.10594: in VariableManager get_vars()
13531 1726882412.10602: done with get_vars()
13531 1726882412.10610: variable 'omit' from source: magic vars
13531 1726882412.10650: in VariableManager get_vars()
13531 1726882412.10672: done with get_vars()
13531 1726882412.10695: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_bond_removal.yml' with nm as provider] *****
13531 1726882412.11313: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
13531 1726882412.11387: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
13531 1726882412.12032: getting the remaining hosts for this loop
13531 1726882412.12034: done getting the remaining hosts for this loop
13531 1726882412.12037: getting the next task for host managed_node2
13531 1726882412.12042: done getting next task for host managed_node2
13531 1726882412.12043: ^ task is: TASK: Gathering Facts
13531 1726882412.12045: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13531 1726882412.12048: getting variables
13531 1726882412.12049: in VariableManager get_vars()
13531 1726882412.12060: Calling all_inventory to load vars for managed_node2
13531 1726882412.12065: Calling groups_inventory to load vars for managed_node2
13531 1726882412.12067: Calling all_plugins_inventory to load vars for managed_node2
13531 1726882412.12082: Calling all_plugins_play to load vars for managed_node2
13531 1726882412.12094: Calling groups_plugins_inventory to load vars for managed_node2
13531 1726882412.12098: Calling groups_plugins_play to load vars for managed_node2
13531 1726882412.12132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13531 1726882412.12192: done with get_vars()
13531 1726882412.12199: done getting variables
13531 1726882412.12271: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_nm.yml:6
Friday 20 September 2024 21:33:32 -0400 (0:00:00.018) 0:00:00.018 ******
13531 1726882412.12321: entering _queue_task() for managed_node2/gather_facts
13531 1726882412.12323: Creating lock for gather_facts
13531 1726882412.12678: worker is 1 (out of 1 available)
13531 1726882412.13988: exiting _queue_task() for managed_node2/gather_facts
13531 1726882412.14000: done queuing things up, now waiting for results queue to drain
13531 1726882412.14002: waiting for pending results...
13531 1726882412.14021: running TaskExecutor() for managed_node2/TASK: Gathering Facts
13531 1726882412.14479: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000001bc
13531 1726882412.14482: variable 'ansible_search_path' from source: unknown
13531 1726882412.14486: calling self._execute()
13531 1726882412.14489: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882412.14491: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882412.14493: variable 'omit' from source: magic vars
13531 1726882412.14495: variable 'omit' from source: magic vars
13531 1726882412.14497: variable 'omit' from source: magic vars
13531 1726882412.14499: variable 'omit' from source: magic vars
13531 1726882412.14516: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
13531 1726882412.14604: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
13531 1726882412.14623: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
13531 1726882412.14687: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13531 1726882412.14697: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13531 1726882412.14726: variable 'inventory_hostname' from source: host vars for 'managed_node2'
13531 1726882412.14729: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882412.14732: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882412.14949: Set connection var ansible_pipelining to False
13531 1726882412.15071: Set connection var ansible_timeout to 10
13531 1726882412.15079: Set connection var ansible_shell_executable to /bin/sh
13531 1726882412.15085: Set connection var ansible_module_compression to ZIP_DEFLATED
13531 1726882412.15088: Set connection var ansible_connection to ssh
13531 1726882412.15090: Set connection var ansible_shell_type to sh
13531 1726882412.15116: variable 'ansible_shell_executable' from source: unknown
13531 1726882412.15119: variable 'ansible_connection' from source: unknown
13531 1726882412.15122: variable 'ansible_module_compression' from source: unknown
13531 1726882412.15124: variable 'ansible_shell_type' from source: unknown
13531 1726882412.15127: variable 'ansible_shell_executable' from source: unknown
13531 1726882412.15129: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882412.15131: variable 'ansible_pipelining' from source: unknown
13531 1726882412.15133: variable 'ansible_timeout' from source: unknown
13531 1726882412.15138: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882412.15526: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
13531 1726882412.15536: variable 'omit' from source: magic vars
13531 1726882412.15541: starting attempt loop
13531 1726882412.15544: running the handler
13531 1726882412.15564: variable 'ansible_facts' from source: unknown
13531 1726882412.15585: _low_level_execute_command(): starting
13531 1726882412.15593: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
13531 1726882412.17532: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
13531 1726882412.17536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
13531 1726882412.17557: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13531 1726882412.17684: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
13531 1726882412.17688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13531 1726882412.17755: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<<
13531 1726882412.17889: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
13531 1726882412.18108: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13531 1726882412.19807: stdout chunk (state=3): >>>/root <<<
13531 1726882412.19886: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13531 1726882412.19970: stderr chunk (state=3): >>><<<
13531 1726882412.19973: stdout chunk (state=3): >>><<<
13531 1726882412.20099: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
13531 1726882412.20103: _low_level_execute_command(): starting
13531 1726882412.20106: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882412.2000113-13564-129929136781327 `" && echo ansible-tmp-1726882412.2000113-13564-129929136781327="` echo /root/.ansible/tmp/ansible-tmp-1726882412.2000113-13564-129929136781327 `" ) && sleep 0'
13531 1726882412.21624: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
13531 1726882412.21628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
13531 1726882412.21668: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13531 1726882412.21672: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
13531 1726882412.21681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13531 1726882412.21857: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
13531 1726882412.21862: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
13531 1726882412.21868: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
13531 1726882412.21973: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13531 1726882412.23858: stdout chunk (state=3): >>>ansible-tmp-1726882412.2000113-13564-129929136781327=/root/.ansible/tmp/ansible-tmp-1726882412.2000113-13564-129929136781327 <<<
13531 1726882412.23969: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13531 1726882412.24058: stderr chunk (state=3): >>><<<
13531 1726882412.24062: stdout chunk (state=3): >>><<<
13531 1726882412.24173: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882412.2000113-13564-129929136781327=/root/.ansible/tmp/ansible-tmp-1726882412.2000113-13564-129929136781327 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882412.24176: variable 'ansible_module_compression' from source: unknown 13531 1726882412.24281: ANSIBALLZ: Using generic lock for ansible.legacy.setup 13531 1726882412.24285: ANSIBALLZ: Acquiring lock 13531 1726882412.24287: ANSIBALLZ: Lock acquired: 139969312288320 13531 1726882412.24288: ANSIBALLZ: Creating module 13531 1726882412.63924: ANSIBALLZ: Writing module into payload 13531 1726882412.64122: ANSIBALLZ: Writing module 13531 1726882412.64160: ANSIBALLZ: Renaming module 13531 1726882412.64179: ANSIBALLZ: Done creating module 13531 1726882412.64221: variable 'ansible_facts' from source: unknown 13531 1726882412.64234: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882412.64248: _low_level_execute_command(): starting 13531 1726882412.64259: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 13531 1726882412.65001: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882412.65017: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882412.65033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882412.65058: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882412.65108: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882412.65122: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882412.65137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882412.65162: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882412.65178: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882412.65189: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882412.65201: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882412.65215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882412.65231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882412.65244: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882412.65261: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882412.65278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882412.65355: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882412.65386: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882412.65405: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882412.65549: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882412.67221: stdout chunk (state=3): >>>PLATFORM <<< 13531 1726882412.67291: stdout chunk (state=3): >>>Linux <<< 13531 1726882412.67320: stdout chunk (state=3): 
>>>FOUND <<< 13531 1726882412.67323: stdout chunk (state=3): >>>/usr/bin/python3.9 /usr/bin/python3 <<< 13531 1726882412.67326: stdout chunk (state=3): >>>/usr/bin/python3 ENDFOUND <<< 13531 1726882412.67517: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882412.67520: stdout chunk (state=3): >>><<< 13531 1726882412.67526: stderr chunk (state=3): >>><<< 13531 1726882412.67543: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.9 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882412.67555 [managed_node2]: found interpreters: ['/usr/bin/python3.9', '/usr/bin/python3', '/usr/bin/python3'] 13531 1726882412.67595: _low_level_execute_command(): starting 13531 1726882412.67598: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 && sleep 0' 13531 
1726882412.67684: Sending initial data 13531 1726882412.67687: Sent initial data (1181 bytes) 13531 1726882412.68105: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882412.68108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882412.68145: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 13531 1726882412.68148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882412.68150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 13531 1726882412.68152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882412.68217: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882412.68221: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882412.68223: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882412.68316: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882412.72070: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 
9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 13531 1726882412.72439: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882412.72872: stderr chunk (state=3): >>><<< 13531 1726882412.72876: stdout chunk (state=3): >>><<< 13531 1726882412.72878: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882412.72881: variable 'ansible_facts' from source: unknown 13531 1726882412.72883: variable 'ansible_facts' from source: unknown 13531 1726882412.72885: variable 'ansible_module_compression' from source: unknown 13531 1726882412.72887: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13531i_snf1g8/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 13531 1726882412.72889: variable 'ansible_facts' from source: unknown 13531 1726882412.72890: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882412.2000113-13564-129929136781327/AnsiballZ_setup.py 13531 1726882412.73026: Sending initial data 13531 1726882412.73029: Sent initial data (154 bytes) 13531 1726882412.74019: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882412.74032: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882412.74046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882412.74067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882412.74110: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882412.74121: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882412.74134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882412.74149: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882412.74160: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882412.74172: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 
1726882412.74187: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882412.74198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882412.74212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882412.74222: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882412.74232: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882412.74243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882412.74333: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882412.74355: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882412.74373: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882412.74500: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882412.76259: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13531 1726882412.76351: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13531 1726882412.76449: stdout chunk (state=3): >>>sftp> put 
/root/.ansible/tmp/ansible-local-13531i_snf1g8/tmptbj99fhc /root/.ansible/tmp/ansible-tmp-1726882412.2000113-13564-129929136781327/AnsiballZ_setup.py <<< 13531 1726882412.76544: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13531 1726882412.79336: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882412.79551: stderr chunk (state=3): >>><<< 13531 1726882412.79555: stdout chunk (state=3): >>><<< 13531 1726882412.79557: done transferring module to remote 13531 1726882412.79559: _low_level_execute_command(): starting 13531 1726882412.79561: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882412.2000113-13564-129929136781327/ /root/.ansible/tmp/ansible-tmp-1726882412.2000113-13564-129929136781327/AnsiballZ_setup.py && sleep 0' 13531 1726882412.80189: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882412.80208: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882412.80228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882412.80247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882412.80298: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882412.80312: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882412.80327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882412.80350: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882412.80363: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882412.80378: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882412.80390: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882412.80404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882412.80421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882412.80436: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882412.80453: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882412.80470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882412.80546: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882412.80575: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882412.80592: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882412.81020: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882412.82498: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882412.82580: stderr chunk (state=3): >>><<< 13531 1726882412.82583: stdout chunk (state=3): >>><<< 13531 1726882412.82682: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882412.82685: _low_level_execute_command(): starting 13531 1726882412.82688: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882412.2000113-13564-129929136781327/AnsiballZ_setup.py && sleep 0' 13531 1726882412.83277: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882412.83292: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882412.83306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882412.83324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882412.83374: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882412.83386: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882412.83400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882412.83418: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882412.83430: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882412.83445: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882412.83459: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 
1726882412.83476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882412.83491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882412.83501: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882412.83510: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882412.83520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882412.83589: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882412.83609: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882412.83623: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882412.83767: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882412.85741: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # <<< 13531 1726882412.85745: stdout chunk (state=3): >>>import '_weakref' # <<< 13531 1726882412.85796: stdout chunk (state=3): >>>import '_io' # <<< 13531 1726882412.85799: stdout chunk (state=3): >>>import 'marshal' # <<< 13531 1726882412.85824: stdout chunk (state=3): >>>import 'posix' # <<< 13531 1726882412.85856: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 13531 1726882412.85904: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 13531 1726882412.85966: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 13531 1726882412.86000: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' <<< 13531 1726882412.86015: stdout chunk (state=3): >>>import '_codecs' # <<< 13531 1726882412.86027: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce373dc0> <<< 13531 1726882412.86069: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 13531 1726882412.86096: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce3183a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce373b20> <<< 13531 1726882412.86127: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 13531 1726882412.86130: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce373ac0> <<< 13531 1726882412.86152: stdout chunk (state=3): >>>import '_signal' # <<< 13531 1726882412.86190: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 13531 1726882412.86211: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce318490> <<< 13531 1726882412.86236: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from 
'/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 13531 1726882412.86262: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce318940> <<< 13531 1726882412.86267: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce318670> <<< 13531 1726882412.86302: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 13531 1726882412.86304: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 13531 1726882412.86339: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 13531 1726882412.86351: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 13531 1726882412.86368: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 13531 1726882412.86391: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 13531 1726882412.86410: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce2cf190> <<< 13531 1726882412.86439: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 13531 1726882412.86442: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 13531 1726882412.86514: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce2cf220> <<< 13531 1726882412.86549: stdout 
chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py <<< 13531 1726882412.86586: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py <<< 13531 1726882412.86596: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce2f2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce2cf940> <<< 13531 1726882412.86606: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce330880> <<< 13531 1726882412.86636: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce2c8d90> <<< 13531 1726882412.86702: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # <<< 13531 1726882412.86705: stdout chunk (state=3): >>>import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce2f2d90> <<< 13531 1726882412.86747: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce318970> <<< 13531 1726882412.86781: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 13531 1726882412.87113: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 13531 1726882412.87117: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 13531 1726882412.87151: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py <<< 13531 1726882412.87154: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 13531 1726882412.87188: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 13531 1726882412.87199: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 13531 1726882412.87216: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce293eb0> <<< 13531 1726882412.87299: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce296f40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 13531 1726882412.87321: stdout chunk (state=3): >>>import '_sre' # <<< 13531 1726882412.87338: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 13531 1726882412.87358: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py <<< 13531 1726882412.87387: stdout chunk (state=3): >>># 
code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 13531 1726882412.87414: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce28c610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce292640> <<< 13531 1726882412.87437: stdout chunk (state=3): >>>import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce293370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 13531 1726882412.87519: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 13531 1726882412.87533: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 13531 1726882412.87561: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 13531 1726882412.87583: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 13531 1726882412.87618: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' <<< 13531 1726882412.87648: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cdf4ddc0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cdf4d8b0> import 'itertools' # <<< 13531 1726882412.87670: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from 
'/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cdf4deb0> <<< 13531 1726882412.87687: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 13531 1726882412.87713: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cdf4df70> <<< 13531 1726882412.87749: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' <<< 13531 1726882412.87766: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cdf4de80> import '_collections' # <<< 13531 1726882412.87809: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce26ed30> import '_functools' # <<< 13531 1726882412.87839: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce267610> <<< 13531 1726882412.87902: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce27b670> <<< 13531 1726882412.87926: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce29ae20> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 13531 1726882412.87954: stdout chunk (state=3): >>># extension module '_struct' loaded from 
'/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cdf5fc70> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce26e250> <<< 13531 1726882412.88000: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' <<< 13531 1726882412.88028: stdout chunk (state=3): >>>import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22ce27b280> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce2a09d0> <<< 13531 1726882412.88051: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 13531 1726882412.88093: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 13531 1726882412.88106: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cdf5ffa0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cdf5fd90> <<< 13531 1726882412.88151: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object 
from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cdf5fd00> <<< 13531 1726882412.88171: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 13531 1726882412.88195: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 13531 1726882412.88213: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 13531 1726882412.88266: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 13531 1726882412.88302: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cdf32370> <<< 13531 1726882412.88323: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 13531 1726882412.88347: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cdf32460> <<< 13531 1726882412.88475: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cdf67fa0> <<< 13531 1726882412.88527: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cdf61a30> <<< 
13531 1726882412.88555: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cdf61490> <<< 13531 1726882412.88558: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 13531 1726882412.88608: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 13531 1726882412.88633: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 13531 1726882412.88637: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cde801c0> <<< 13531 1726882412.88662: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cdf1dc70> <<< 13531 1726882412.88727: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cdf61eb0> <<< 13531 1726882412.88731: stdout chunk (state=3): >>>import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce2a0040> <<< 13531 1726882412.88753: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 13531 1726882412.88793: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cde92af0> <<< 13531 
1726882412.88821: stdout chunk (state=3): >>>import 'errno' # <<< 13531 1726882412.88862: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cde92e20> <<< 13531 1726882412.88880: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 13531 1726882412.88903: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cdea4730> <<< 13531 1726882412.88916: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 13531 1726882412.88940: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 13531 1726882412.88976: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cdea4c70> <<< 13531 1726882412.89029: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' <<< 13531 1726882412.89032: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cde323a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cde92f10> <<< 13531 1726882412.89045: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 13531 1726882412.89102: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cde42280> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cdea45b0> <<< 13531 1726882412.89105: stdout chunk (state=3): >>>import 'pwd' # <<< 13531 1726882412.89132: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cde42340> <<< 13531 1726882412.89173: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cdf5f9d0> <<< 13531 1726882412.89207: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 13531 1726882412.89210: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 13531 1726882412.89234: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 13531 1726882412.89276: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cde5e6a0> <<< 13531 1726882412.89305: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 13531 1726882412.89330: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cde5e970> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cde5e760> <<< 13531 1726882412.89348: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cde5e850> <<< 13531 1726882412.89374: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 13531 1726882412.89611: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cde5eca0> <<< 13531 1726882412.89625: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from 
'/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cde6a1f0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cde5e8e0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cde51a30> <<< 13531 1726882412.89654: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cdf5f5b0> <<< 13531 1726882412.89673: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 13531 1726882412.89726: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 13531 1726882412.89756: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cde5ea90> <<< 13531 1726882412.89914: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 13531 1726882412.89926: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f22cdd7f670> <<< 13531 1726882412.90161: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 13531 1726882412.90279: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882412.90323: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available <<< 13531 1726882412.90332: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available 
<<< 13531 1726882412.91544: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882412.92541: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd7657c0> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cd765160> <<< 13531 1726882412.92582: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd765280> <<< 13531 1726882412.92611: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd765f10> <<< 13531 1726882412.92628: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 13531 1726882412.92678: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd7654f0> 
import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd765d30> import 'atexit' # <<< 13531 1726882412.92712: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cd765f70> <<< 13531 1726882412.92727: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 13531 1726882412.92751: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 13531 1726882412.92795: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd765100> <<< 13531 1726882412.92814: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 13531 1726882412.92835: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 13531 1726882412.92838: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 13531 1726882412.92871: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 13531 1726882412.92884: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 13531 1726882412.92978: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd73a130> <<< 13531 1726882412.93005: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from 
'/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cd63e0d0> <<< 13531 1726882412.93033: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cd63e2b0> <<< 13531 1726882412.93052: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 13531 1726882412.93098: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd63ec40> <<< 13531 1726882412.93110: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd74cdc0> <<< 13531 1726882412.93296: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd74c3a0> <<< 13531 1726882412.93299: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 13531 1726882412.93319: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd74cf70> <<< 13531 1726882412.93340: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py <<< 13531 1726882412.93342: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 13531 1726882412.93380: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 13531 1726882412.93410: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 13531 1726882412.93418: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 13531 1726882412.93432: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd79ac10> <<< 13531 1726882412.93516: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd768cd0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd7683a0> <<< 13531 1726882412.93524: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd719b80> <<< 13531 1726882412.93542: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cd7684c0> <<< 13531 1726882412.93570: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' <<< 13531 1726882412.93590: stdout chunk 
(state=3): >>>import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd7684f0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 13531 1726882412.93605: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 13531 1726882412.93616: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 13531 1726882412.93661: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 13531 1726882412.93726: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' <<< 13531 1726882412.93732: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cd69c250> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd7ac1f0> <<< 13531 1726882412.93755: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 13531 1726882412.93758: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 13531 1726882412.93819: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' <<< 13531 1726882412.93822: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cd6a98e0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f22cd7ac370> <<< 13531 1726882412.93833: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 13531 1726882412.93871: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 13531 1726882412.93889: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py <<< 13531 1726882412.93906: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # <<< 13531 1726882412.93955: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd7acca0> <<< 13531 1726882412.94085: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd6a9880> <<< 13531 1726882412.94178: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cd69c8b0> <<< 13531 1726882412.94211: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' <<< 13531 1726882412.94214: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cd745190> <<< 13531 1726882412.94244: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from 
'/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' <<< 13531 1726882412.94248: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cd7ac670> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd7a58b0> <<< 13531 1726882412.94276: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 13531 1726882412.94299: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 13531 1726882412.94302: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 13531 1726882412.94348: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cd69e9d0> <<< 13531 1726882412.94539: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cd6bbb80> <<< 13531 1726882412.94544: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd6a8640> <<< 13531 1726882412.94604: stdout chunk (state=3): >>># 
extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' <<< 13531 1726882412.94610: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cd69ef70> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd6a8a30> <<< 13531 1726882412.94613: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882412.94615: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882412.94618: stdout chunk (state=3): >>>import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py <<< 13531 1726882412.94629: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882412.94690: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882412.94770: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13531 1726882412.94804: stdout chunk (state=3): >>>import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available <<< 13531 1726882412.94811: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py <<< 13531 1726882412.94823: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882412.94920: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882412.95017: stdout chunk (state=3): >>># zipimport: zlib available 
<<< 13531 1726882412.95465: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882412.95943: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # <<< 13531 1726882412.95948: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py <<< 13531 1726882412.95960: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 13531 1726882412.96017: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cd6e47c0> <<< 13531 1726882412.96091: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd6e9820> <<< 13531 1726882412.96104: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd2669a0> <<< 13531 1726882412.96153: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py <<< 13531 1726882412.96156: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882412.96190: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882412.96195: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available <<< 13531 1726882412.96314: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882412.96455: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 13531 1726882412.96471: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd723760> # zipimport: zlib available <<< 13531 1726882412.96851: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882412.97212: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882412.97262: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882412.97331: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available <<< 13531 1726882412.97368: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882412.97407: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py <<< 13531 1726882412.97410: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 
1726882412.97458: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882412.97557: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available <<< 13531 1726882412.97574: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py <<< 13531 1726882412.97583: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882412.97595: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882412.97641: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available <<< 13531 1726882412.97825: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882412.98011: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 13531 1726882412.98042: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # <<< 13531 1726882412.98122: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd7673d0> <<< 13531 1726882412.98125: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882412.98184: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882412.98268: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py 
<<< 13531 1726882412.98281: stdout chunk (state=3): >>>import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available <<< 13531 1726882412.98308: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882412.98345: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available <<< 13531 1726882412.98386: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882412.98422: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882412.98514: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882412.98572: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 13531 1726882412.98597: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 13531 1726882412.98668: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' <<< 13531 1726882412.98681: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from 
'/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cd6db9a0> <<< 13531 1726882412.98760: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd0e3be0> <<< 13531 1726882412.98807: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py <<< 13531 1726882412.98810: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882412.98852: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882412.98909: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882412.98928: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882412.98991: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 13531 1726882412.98994: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 13531 1726882412.99006: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 13531 1726882412.99027: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 13531 1726882412.99050: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 13531 1726882412.99072: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 13531 
1726882412.99147: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd6ec670> <<< 13531 1726882412.99183: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd737d90> <<< 13531 1726882412.99243: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd767400> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available <<< 13531 1726882412.99285: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882412.99300: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py <<< 13531 1726882412.99384: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py <<< 13531 1726882412.99405: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available <<< 13531 1726882412.99461: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882412.99520: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882412.99546: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 
1726882412.99550: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882412.99585: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882412.99618: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882412.99651: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882412.99693: stdout chunk (state=3): >>>import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py <<< 13531 1726882412.99697: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882412.99754: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882412.99823: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882412.99838: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882412.99865: stdout chunk (state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py <<< 13531 1726882412.99895: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.00024: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.00159: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.00198: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.00245: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' <<< 13531 1726882413.00274: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py <<< 13531 1726882413.00299: stdout chunk (state=3): 
>>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' <<< 13531 1726882413.00337: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd293ac0> <<< 13531 1726882413.00352: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' <<< 13531 1726882413.00384: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py <<< 13531 1726882413.00407: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' <<< 13531 1726882413.00446: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' <<< 13531 1726882413.00454: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd247a90> <<< 13531 1726882413.00483: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cd247a00> <<< 13531 1726882413.00549: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd27b760> <<< 13531 
1726882413.00576: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd293190> <<< 13531 1726882413.00604: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ccfe6f10> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ccfe6af0> <<< 13531 1726882413.00616: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py <<< 13531 1726882413.00652: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' <<< 13531 1726882413.00656: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' <<< 13531 1726882413.00695: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cd748cd0> <<< 13531 1726882413.00713: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd234160> <<< 13531 1726882413.00725: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' <<< 13531 1726882413.00750: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd7482e0> <<< 13531 1726882413.00782: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py <<< 13531 1726882413.00795: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' <<< 13531 1726882413.00822: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cd04efa0> <<< 13531 1726882413.00851: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd278dc0> <<< 13531 1726882413.00899: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ccfe6dc0> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py <<< 13531 1726882413.00926: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.00929: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available <<< 13531 1726882413.00976: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.01030: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.facter # loaded 
from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available <<< 13531 1726882413.01077: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.01133: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py <<< 13531 1726882413.01152: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available <<< 13531 1726882413.01182: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.01216: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py <<< 13531 1726882413.01219: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.01255: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.01301: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available <<< 13531 1726882413.01340: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.01410: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py <<< 13531 
1726882413.01413: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.01433: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.01488: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.01528: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.01590: stdout chunk (state=3): >>>import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available <<< 13531 1726882413.01981: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.02352: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available <<< 13531 1726882413.02389: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.02443: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.02470: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.02512: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py <<< 13531 1726882413.02527: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.02553: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.02569: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available <<< 13531 1726882413.02619: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.02667: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py <<< 13531 1726882413.02691: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.02717: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.02730: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available <<< 13531 1726882413.02757: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.02797: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py <<< 13531 1726882413.02800: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.02851: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.02942: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' <<< 13531 1726882413.02962: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd26b670> <<< 13531 
1726882413.02977: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py <<< 13531 1726882413.03000: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' <<< 13531 1726882413.03161: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ccf66f10> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available <<< 13531 1726882413.03217: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.03287: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py <<< 13531 1726882413.03290: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.03353: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.03434: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available <<< 13531 1726882413.03496: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.03570: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py <<< 13531 1726882413.03574: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.03596: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.03649: stdout 
chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py <<< 13531 1726882413.03665: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' <<< 13531 1726882413.03811: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22ccf59c10> <<< 13531 1726882413.04056: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ccfa5b20> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available <<< 13531 1726882413.04101: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.04157: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available <<< 13531 1726882413.04232: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.04298: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.04395: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.04529: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available <<< 13531 1726882413.04573: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.04616: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py <<< 13531 1726882413.04620: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.04641: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.04689: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' <<< 13531 1726882413.04754: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22ccee14f0> <<< 13531 1726882413.04788: stdout chunk (state=3): >>>import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ccee1a30> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py <<< 13531 1726882413.04791: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 
1726882413.04820: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.04859: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available <<< 13531 1726882413.04995: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.05127: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py <<< 13531 1726882413.05132: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.05202: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.05288: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.05315: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.05361: stdout chunk (state=3): >>>import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py <<< 13531 1726882413.05376: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.05453: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.05478: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.05588: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.05713: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.freebsd # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available <<< 13531 1726882413.05820: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.05925: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available <<< 13531 1726882413.05954: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.05981: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.06425: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.06840: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available <<< 13531 1726882413.06932: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.07034: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py <<< 13531 1726882413.07038: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.07107: stdout chunk (state=3): >>># zipimport: zlib available <<< 
13531 1726882413.07193: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available <<< 13531 1726882413.07323: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.07476: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py <<< 13531 1726882413.07490: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available <<< 13531 1726882413.07523: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.07578: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py <<< 13531 1726882413.07581: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.07651: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.07736: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.07918: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.08098: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py <<< 13531 1726882413.08101: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.08122: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.08160: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available <<< 13531 1726882413.08194: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.08216: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available <<< 13531 1726882413.08277: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.08341: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available <<< 13531 1726882413.08371: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.08395: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available <<< 13531 1726882413.08440: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.08495: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hpux # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available <<< 13531 1726882413.08546: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.08607: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py <<< 13531 1726882413.08610: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.08819: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.09037: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available <<< 13531 1726882413.09086: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.09164: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py <<< 13531 1726882413.09196: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13531 1726882413.09218: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available <<< 13531 1726882413.09242: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.09296: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.netbsd # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py <<< 13531 1726882413.09311: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.09314: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.09340: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available <<< 13531 1726882413.09431: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.09509: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py <<< 13531 1726882413.09528: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py <<< 13531 1726882413.09531: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.09558: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.09612: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py <<< 13531 1726882413.09625: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13531 1726882413.09651: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.09685: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.09730: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 13531 1726882413.09783: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.09865: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py <<< 13531 1726882413.09899: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.09939: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.10079: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available <<< 13531 1726882413.10119: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.10285: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available <<< 13531 1726882413.10325: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.10367: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py <<< 13531 1726882413.10386: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 13531 1726882413.10410: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.10457: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available <<< 13531 1726882413.10526: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.10612: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py <<< 13531 1726882413.10615: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.10682: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.10752: stdout chunk (state=3): >>>import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py <<< 13531 1726882413.10836: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882413.11092: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches 
/usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' <<< 13531 1726882413.11106: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 13531 1726882413.11137: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' <<< 13531 1726882413.11151: stdout chunk (state=3): >>># extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22ccd26070> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ccd268b0> <<< 13531 1726882413.11199: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ccd26880> <<< 13531 1726882413.12561: stdout chunk (state=3): >>>import 'gc' # <<< 13531 1726882413.18743: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py <<< 13531 1726882413.18748: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ccd26e80> <<< 13531 1726882413.18773: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' <<< 13531 1726882413.18785: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f22ccf2ce50> <<< 13531 1726882413.18843: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' <<< 13531 1726882413.18897: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cced7340> <<< 13531 1726882413.18900: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cced7a30> <<< 13531 1726882413.19169: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 13531 1726882413.39476: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": 
"UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-158", "ansible_nodename": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "21e18164a0c64d0daed004bd8a1b67b7", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALEARW5ZJ51XTLSDuUsPojumVU0f1DmiQsXjMOap4QLlljOiysapjSUe6pZOyAdiI/KfARhDoOFvlC07kCLCcs7DDk8JxBZpsM0D55SdDlfwsB3FVgWNP+9by8G6kzbePHWdZyyWlAuavj4OAEwAjpWpP8/daus0ha4xywlVVoKjAAAAFQCbiW4bR+tgMvjrxC198dqI1mTbjQAAAIBzCzkJTtnGDKfOHq2dFI5cUEuaj1PgRot3wyaXENzUjZVnIFgXUmgKDCxO+EAtU6uAkBPQF4XNgiuaw5bavYpZxcJ4WIpM4ZDRoSkc7BBbJPRLZ45GfrHJwgqAmAZ3RSvVqeXE4WKQHLm43/eDHewgPqqqWe6QVuQH5SEe79yk3wAAAIEArG+AuupiAeoVJ9Lh36QMj4kRo5pTASh2eD5MqSOdy39UhsXbWBcj3JCIvNk/nwep/9neGyRZ5t5wT05dRX80vlgZJX65hrbepO+lqC3wlng+6GQ34D7TJKYnvEkR3neE0+06kx5R6IRWZf1YQV6fMQhx8AJ2JmvnLFicmYlkhQQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDND+RJCrYgIUzolo5fZ64Ey6cksefKDUWmGDjsqVTmuT3HrlDyUZOro4JAnUQBmiamXsJUFbrFdJAVpukD4yyowqCQLr0ZFuKNEzrt5CObrtWflOskKynO3kaoU0WhDkqIbwS2j/+NxBCxgDGqd/5Os3cOMv3eyjUElz6xoI4zsmGMfxVYmT+/SHBfoyxyqY8Hw2Ooq+H5L9OlYgV4hqu7kKPpM1THUJTjy47m6qvws5gztclLjPA1KIW2Dz6kKzUYspNJcoS2sK1xFvL7mBjpGAP7WhXVH2n5ySenQ24Z6mEj+tG2f11rjPpjCUjDzzciGCWiRDZWBLm/GGmQXJJ8zAYnw82yIUKqufLrr1wmcXICPMVj9pFjXSoBWe/yhX9E87w7YD5HWsUrgrLdSctdV4QYy+R5g9ERi7FjwbRsuZ04BihZs70+f/29hUzuc6MA87KVovGT0Uc7GVC7bx8NLt0bTBsbydlONVHVQuol/YEpQrQophDvmBfh+PgMDH8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOEITn1vyppR+Moe1UdR0WGPhUnQ/dwHNcNi0OYy21LkBQ5jsxOPLvZ+C2MbRYlz2afs4nYYIV8E0AuK6aRks3w=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKEdFOHVk9tX1R+zEyLVdxS/U5QeeeFYWSnUmjpXlpt7", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_lsb": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": 
"::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", 
"rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::104f:68ff:fe7a:deb1", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off 
[fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.158"], "ansible_all_ipv6_addresses": ["fe80::104f:68ff:fe7a:deb1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.158", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::104f:68ff:fe7a:deb1"]}, "ansible_loadavg": {"1m": 0.59, "5m": 0.38, "15m": 0.18}, "ansible_pkg_mgr": "dnf", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 33528 10.31.11.158 22", "XDG_SESSION_CLASS": "user", 
"SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 33528 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_service_mgr": "systemd", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "33", "second": "33", "epoch": "1726882413", "epoch_int": "1726882413", "date": "2024-09-20", "time": "21:33:33", "iso8601_micro": "2024-09-21T01:33:33.170943Z", "iso8601": "2024-09-21T01:33:33Z", "iso8601_basic": "20240920T213333170943", "iso8601_basic_short": "20240920T213333", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memf<<< 13531 1726882413.39493: stdout chunk (state=3): >>>ree_mb": 2814, "ansible_swaptotal_mb": 0, 
"ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 718, "free": 2814}, "nocache": {"free": 3275, "used": 257}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_uuid": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 352, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264241688576, "block_size": 4096, "block_total": 65519355, "block_available": 
64512131, "block_used": 1007224, "inode_total": 131071472, "inode_available": 130998698, "inode_used": 72774, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_iscsi_iqn": "", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 13531 1726882413.40056: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # 
cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 <<< 13531 1726882413.40061: stdout chunk (state=3): >>># cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] 
removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing 
ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 <<< 13531 1726882413.40093: stdout chunk (state=3): >>># destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] 
removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing 
ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform <<< 13531 1726882413.40138: stdout chunk (state=3): >>># cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing 
ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy 
ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware <<< 13531 1726882413.40152: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy 
ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 13531 1726882413.40412: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 13531 1726882413.40434: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 13531 1726882413.40496: stdout chunk (state=3): >>># destroy zipimport # destroy 
_compression <<< 13531 1726882413.40507: stdout chunk (state=3): >>># destroy binascii # destroy importlib # destroy bz2 # destroy lzma <<< 13531 1726882413.40539: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid <<< 13531 1726882413.40587: stdout chunk (state=3): >>># destroy selinux # destroy distro # destroy logging # destroy argparse <<< 13531 1726882413.40648: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing <<< 13531 1726882413.40676: stdout chunk (state=3): >>># destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle <<< 13531 1726882413.40720: stdout chunk (state=3): >>># destroy queue # destroy multiprocessing.reduction <<< 13531 1726882413.40783: stdout chunk (state=3): >>># destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob <<< 13531 1726882413.40787: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection <<< 13531 1726882413.40889: stdout chunk (state=3): >>># cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] 
wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc <<< 13531 1726882413.40984: stdout chunk (state=3): >>># cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap <<< 13531 1726882413.41052: stdout chunk (state=3): >>># cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] 
wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings <<< 13531 1726882413.41070: stdout chunk (state=3): >>># cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 13531 1726882413.41278: stdout chunk (state=3): >>># destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize <<< 13531 1726882413.41325: stdout chunk (state=3): >>># destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors <<< 13531 1726882413.41359: stdout chunk (state=3): >>># destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy 
ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator <<< 13531 1726882413.41363: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 13531 1726882413.41391: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 13531 1726882413.41828: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 13531 1726882413.41831: stdout chunk (state=3): >>><<< 13531 1726882413.41833: stderr chunk (state=3): >>><<< 13531 1726882413.42095: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce373dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce3183a0> import 'encodings' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f22ce373b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce373ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce318490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce318940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce318670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce2cf190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' 
import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce2cf220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce2f2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce2cf940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce330880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce2c8d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce2f2d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce318970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce293eb0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce296f40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce28c610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce292640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce293370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches 
/usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cdf4ddc0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cdf4d8b0> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cdf4deb0> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cdf4df70> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cdf4de80> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce26ed30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce267610> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f22ce27b670> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce29ae20> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cdf5fc70> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce26e250> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22ce27b280> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce2a09d0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cdf5ffa0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cdf5fd90> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches 
/usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cdf5fd00> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cdf32370> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cdf32460> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cdf67fa0> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cdf61a30> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cdf61490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from 
'/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cde801c0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cdf1dc70> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cdf61eb0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ce2a0040> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cde92af0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cde92e20> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cdea4730> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' 
import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cdea4c70> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cde323a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cde92f10> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cde42280> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cdea45b0> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cde42340> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cdf5f9d0> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from 
'/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cde5e6a0> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cde5e970> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cde5e760> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cde5e850> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cde5eca0> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cde6a1f0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cde5e8e0> import 
'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cde51a30> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cdf5f5b0> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cde5ea90> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f22cdd7f670> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd7657c0> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from 
'/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cd765160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd765280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd765f10> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd7654f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd765d30> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cd765f70> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd765100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code 
object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd73a130> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cd63e0d0> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cd63e2b0> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd63ec40> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd74cdc0> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd74c3a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd74cf70> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches 
/usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd79ac10> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd768cd0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd7683a0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd719b80> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cd7684c0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd7684f0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cd69c250> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd7ac1f0> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cd6a98e0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd7ac370> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd7acca0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd6a9880> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cd69c8b0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cd745190> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cd7ac670> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd7a58b0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cd69e9d0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cd6bbb80> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd6a8640> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cd69ef70> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd6a8a30> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import 
'_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cd6e47c0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd6e9820> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd2669a0> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd723760> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py # 
zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd7673d0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cd6db9a0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd0e3be0> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd6ec670> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd737d90> import 'distro' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f22cd767400> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd293ac0> # /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd247a90> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cd247a00> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd27b760> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd293190> import 
'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ccfe6f10> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ccfe6af0> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cd748cd0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd234160> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd7482e0> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22cd04efa0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f22cd278dc0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ccfe6dc0> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cd26b670> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ccf66f10> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22ccf59c10> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ccfa5b20> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # 
zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22ccee14f0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ccee1a30> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available 
import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # 
zipimport: zlib available import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib 
available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_bix03mhm/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: zlib available # /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22ccd26070> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ccd268b0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ccd26880> import 'gc' # # /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ccd26e80> # /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ccf2ce50> # 
/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cced7340> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22cced7a30> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": 
"targeted"}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-158", "ansible_nodename": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "21e18164a0c64d0daed004bd8a1b67b7", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALEARW5ZJ51XTLSDuUsPojumVU0f1DmiQsXjMOap4QLlljOiysapjSUe6pZOyAdiI/KfARhDoOFvlC07kCLCcs7DDk8JxBZpsM0D55SdDlfwsB3FVgWNP+9by8G6kzbePHWdZyyWlAuavj4OAEwAjpWpP8/daus0ha4xywlVVoKjAAAAFQCbiW4bR+tgMvjrxC198dqI1mTbjQAAAIBzCzkJTtnGDKfOHq2dFI5cUEuaj1PgRot3wyaXENzUjZVnIFgXUmgKDCxO+EAtU6uAkBPQF4XNgiuaw5bavYpZxcJ4WIpM4ZDRoSkc7BBbJPRLZ45GfrHJwgqAmAZ3RSvVqeXE4WKQHLm43/eDHewgPqqqWe6QVuQH5SEe79yk3wAAAIEArG+AuupiAeoVJ9Lh36QMj4kRo5pTASh2eD5MqSOdy39UhsXbWBcj3JCIvNk/nwep/9neGyRZ5t5wT05dRX80vlgZJX65hrbepO+lqC3wlng+6GQ34D7TJKYnvEkR3neE0+06kx5R6IRWZf1YQV6fMQhx8AJ2JmvnLFicmYlkhQQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDND+RJCrYgIUzolo5fZ64Ey6cksefKDUWmGDjsqVTmuT3HrlDyUZOro4JAnUQBmiamXsJUFbrFdJAVpukD4yyowqCQLr0ZFuKNEzrt5CObrtWflOskKynO3kaoU0WhDkqIbwS2j/+NxBCxgDGqd/5Os3cOMv3eyjUElz6xoI4zsmGMfxVYmT+/SHBfoyxyqY8Hw2Ooq+H5L9OlYgV4hqu7kKPpM1THUJTjy47m6qvws5gztclLjPA1KIW2Dz6kKzUYspNJcoS2sK1xFvL7mBjpGAP7WhXVH2n5ySenQ24Z6mEj+tG2f11rjPpjCUjDzzciGCWiRDZWBLm/GGmQXJJ8zAYnw82yIUKqufLrr1wmcXICPMVj9pFjXSoBWe/yhX9E87w7YD5HWsUrgrLdSctdV4QYy+R5g9ERi7FjwbRsuZ04BihZs70+f/29hUzuc6MA87KVovGT0Uc7GVC7bx8NLt0bTBsbydlONVHVQuol/YEpQrQophDvmBfh+PgMDH8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", 
"ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOEITn1vyppR+Moe1UdR0WGPhUnQ/dwHNcNi0OYy21LkBQ5jsxOPLvZ+C2MbRYlz2afs4nYYIV8E0AuK6aRks3w=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKEdFOHVk9tX1R+zEyLVdxS/U5QeeeFYWSnUmjpXlpt7", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_lsb": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", 
"generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": 
"255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::104f:68ff:fe7a:deb1", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", 
"rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.158"], "ansible_all_ipv6_addresses": ["fe80::104f:68ff:fe7a:deb1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.158", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::104f:68ff:fe7a:deb1"]}, "ansible_loadavg": {"1m": 0.59, "5m": 0.38, "15m": 0.18}, "ansible_pkg_mgr": "dnf", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 33528 10.31.11.158 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 33528 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which 
--tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_service_mgr": "systemd", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "33", "second": "33", "epoch": "1726882413", "epoch_int": "1726882413", "date": "2024-09-20", "time": "21:33:33", "iso8601_micro": "2024-09-21T01:33:33.170943Z", "iso8601": "2024-09-21T01:33:33Z", "iso8601_basic": "20240920T213333170943", "iso8601_basic_short": "20240920T213333", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2814, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 718, "free": 2814}, "nocache": {"free": 3275, "used": 257}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", 
"ansible_product_serial": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_uuid": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 352, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264241688576, "block_size": 4096, "block_total": 65519355, "block_available": 64512131, "block_used": 1007224, "inode_total": 131071472, "inode_available": 130998698, "inode_used": 72774, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_iscsi_iqn": "", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear 
sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] 
removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] 
removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # 
cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy 
__main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy 
ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing 
ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy 
ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy 
ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy 
multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # 
destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # 
destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
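The warning that follows is triggered because the remote environment has `PYTHONVERBOSE=1` set (visible in `ansible_env` in the facts above), so the module's stdout carries the interpreter's import/cleanup chatter after the JSON payload. ansible-core recovers by decoding only the leading JSON object and reporting the remainder as junk. A minimal sketch of that recovery idea (the function name and return shape here are illustrative, not Ansible's actual internal API):

```python
import json

def parse_module_output(stdout: str):
    """Decode the leading JSON object from module stdout, tolerating
    trailing junk such as PYTHONVERBOSE interpreter cleanup lines.

    Illustrative only: ansible-core's real implementation differs, but
    the principle is the same -- stop at the end of the first complete
    JSON document and treat the rest as junk to warn about.
    """
    decoder = json.JSONDecoder()
    start = stdout.index("{")            # payload begins at the first brace
    data, end = decoder.raw_decode(stdout, start)
    junk = stdout[end:].strip()          # everything after the JSON object
    return data, junk

# Example: a facts payload followed by interpreter cleanup noise
raw = '{"ansible_facts": {"ansible_pkg_mgr": "dnf"}} # clear builtins._ # clear sys.path'
facts, junk = parse_module_output(raw)
```

With input like the log above, `facts` holds the parsed facts dict and `junk` holds the `# clear ... # cleanup[2] ...` stream that the warning below echoes. Unsetting `PYTHONVERBOSE` on the managed node makes the junk (and the warning) disappear.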
[WARNING]: Module invocation had junk after the JSON data: # clear builtins._ # clear sys.path # clear sys.argv [... identical interpreter cleanup output as emitted above ...]
removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # 
cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing 
ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy 
ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy 
ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy 
ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # 
cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy 
posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks
[WARNING]: Platform linux on host managed_node2 is using the discovered Python interpreter at /usr/bin/python3.9, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information.
13531 1726882413.43472: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882412.2000113-13564-129929136781327/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13531 1726882413.43490: _low_level_execute_command(): starting 13531 1726882413.43496: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882412.2000113-13564-129929136781327/ > /dev/null 2>&1 && sleep 0' 13531 1726882413.45353: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882413.45446: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882413.45466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882413.45488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882413.45531: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882413.45599: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882413.45614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882413.45632: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882413.45644: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 
10.31.11.158 is address <<< 13531 1726882413.45656: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882413.45670: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882413.45683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882413.45700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882413.45711: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882413.45720: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882413.45735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882413.45855: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882413.45938: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882413.45954: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882413.46101: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882413.48014: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882413.48078: stderr chunk (state=3): >>><<< 13531 1726882413.48082: stdout chunk (state=3): >>><<< 13531 1726882413.48375: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882413.48380: handler run complete 13531 1726882413.48382: variable 'ansible_facts' from source: unknown 13531 1726882413.48431: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882413.48908: variable 'ansible_facts' from source: unknown 13531 1726882413.49306: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882413.49709: attempt loop complete, returning result 13531 1726882413.49851: _execute() done 13531 1726882413.49881: dumping result to json 13531 1726882413.49916: done dumping result, returning 13531 1726882413.49928: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0e448fcc-3ce9-4fd9-519d-0000000001bc] 13531 1726882413.49939: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000001bc ok: [managed_node2] 13531 1726882413.52988: no more pending results, returning what we have 13531 1726882413.52992: results queue empty 13531 1726882413.52993: checking for any_errors_fatal 13531 1726882413.52995: done checking for any_errors_fatal 13531 1726882413.52995: checking for max_fail_percentage 13531 1726882413.52997: done checking for max_fail_percentage 13531 1726882413.52998: checking to see if all hosts have failed and the running result is not ok 13531 
1726882413.52999: done checking to see if all hosts have failed 13531 1726882413.53000: getting the remaining hosts for this loop 13531 1726882413.53002: done getting the remaining hosts for this loop 13531 1726882413.53005: getting the next task for host managed_node2 13531 1726882413.53012: done getting next task for host managed_node2 13531 1726882413.53014: ^ task is: TASK: meta (flush_handlers) 13531 1726882413.53016: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882413.53021: getting variables 13531 1726882413.53023: in VariableManager get_vars() 13531 1726882413.53048: Calling all_inventory to load vars for managed_node2 13531 1726882413.53050: Calling groups_inventory to load vars for managed_node2 13531 1726882413.53054: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882413.53070: Calling all_plugins_play to load vars for managed_node2 13531 1726882413.53073: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882413.53076: Calling groups_plugins_play to load vars for managed_node2 13531 1726882413.53262: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882413.53762: done with get_vars() 13531 1726882413.53775: done getting variables 13531 1726882413.53873: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000001bc 13531 1726882413.53876: WORKER PROCESS EXITING 13531 1726882413.53920: in VariableManager get_vars() 13531 1726882413.53930: Calling all_inventory to load vars for managed_node2 13531 1726882413.53932: Calling groups_inventory to load vars for managed_node2 13531 1726882413.53934: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882413.53939: 
Calling all_plugins_play to load vars for managed_node2 13531 1726882413.53941: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882413.53948: Calling groups_plugins_play to load vars for managed_node2 13531 1726882413.54144: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882413.54331: done with get_vars() 13531 1726882413.54346: done queuing things up, now waiting for results queue to drain 13531 1726882413.54348: results queue empty 13531 1726882413.54349: checking for any_errors_fatal 13531 1726882413.54352: done checking for any_errors_fatal 13531 1726882413.54353: checking for max_fail_percentage 13531 1726882413.54354: done checking for max_fail_percentage 13531 1726882413.54354: checking to see if all hosts have failed and the running result is not ok 13531 1726882413.54355: done checking to see if all hosts have failed 13531 1726882413.54356: getting the remaining hosts for this loop 13531 1726882413.54357: done getting the remaining hosts for this loop 13531 1726882413.54359: getting the next task for host managed_node2 13531 1726882413.54589: done getting next task for host managed_node2 13531 1726882413.54593: ^ task is: TASK: Include the task 'el_repo_setup.yml' 13531 1726882413.54595: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882413.54597: getting variables 13531 1726882413.54598: in VariableManager get_vars() 13531 1726882413.54607: Calling all_inventory to load vars for managed_node2 13531 1726882413.54609: Calling groups_inventory to load vars for managed_node2 13531 1726882413.54612: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882413.54616: Calling all_plugins_play to load vars for managed_node2 13531 1726882413.54619: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882413.54622: Calling groups_plugins_play to load vars for managed_node2 13531 1726882413.54776: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882413.54957: done with get_vars() 13531 1726882413.54967: done getting variables

TASK [Include the task 'el_repo_setup.yml'] ************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_nm.yml:11
Friday 20 September 2024  21:33:33 -0400 (0:00:01.427)       0:00:01.445 ******

13531 1726882413.55039: entering _queue_task() for managed_node2/include_tasks 13531 1726882413.55041: Creating lock for include_tasks 13531 1726882413.55427: worker is 1 (out of 1 available) 13531 1726882413.55444: exiting _queue_task() for managed_node2/include_tasks 13531 1726882413.55457: done queuing things up, now waiting for results queue to drain 13531 1726882413.55459: waiting for pending results... 
13531 1726882413.55840: running TaskExecutor() for managed_node2/TASK: Include the task 'el_repo_setup.yml'
13531 1726882413.56130: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000006
13531 1726882413.56160: variable 'ansible_search_path' from source: unknown
13531 1726882413.56203: calling self._execute()
13531 1726882413.56291: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882413.56304: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882413.56317: variable 'omit' from source: magic vars
13531 1726882413.56426: _execute() done
13531 1726882413.56477: dumping result to json
13531 1726882413.56485: done dumping result, returning
13531 1726882413.56496: done running TaskExecutor() for managed_node2/TASK: Include the task 'el_repo_setup.yml' [0e448fcc-3ce9-4fd9-519d-000000000006]
13531 1726882413.56591: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000006
13531 1726882413.56734: no more pending results, returning what we have
13531 1726882413.56740: in VariableManager get_vars()
13531 1726882413.56776: Calling all_inventory to load vars for managed_node2
13531 1726882413.56779: Calling groups_inventory to load vars for managed_node2
13531 1726882413.56783: Calling all_plugins_inventory to load vars for managed_node2
13531 1726882413.56798: Calling all_plugins_play to load vars for managed_node2
13531 1726882413.56801: Calling groups_plugins_inventory to load vars for managed_node2
13531 1726882413.56804: Calling groups_plugins_play to load vars for managed_node2
13531 1726882413.56993: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13531 1726882413.57199: done with get_vars()
13531 1726882413.57206: variable 'ansible_search_path' from source: unknown
13531 1726882413.57224: we have included files to process
13531 1726882413.57225: generating all_blocks data
13531 1726882413.57227: done generating all_blocks data
13531 1726882413.57227: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml
13531 1726882413.57229: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml
13531 1726882413.57232: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml
13531 1726882413.57893: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000006
13531 1726882413.57897: WORKER PROCESS EXITING
13531 1726882413.59214: in VariableManager get_vars()
13531 1726882413.59230: done with get_vars()
13531 1726882413.59241: done processing included file
13531 1726882413.59243: iterating over new_blocks loaded from include file
13531 1726882413.59244: in VariableManager get_vars()
13531 1726882413.59252: done with get_vars()
13531 1726882413.59254: filtering new block on tags
13531 1726882413.59448: done filtering new block on tags
13531 1726882413.59451: in VariableManager get_vars()
13531 1726882413.59463: done with get_vars()
13531 1726882413.59467: filtering new block on tags
13531 1726882413.59484: done filtering new block on tags
13531 1726882413.59487: in VariableManager get_vars()
13531 1726882413.59498: done with get_vars()
13531 1726882413.59499: filtering new block on tags
13531 1726882413.59512: done filtering new block on tags
13531 1726882413.59514: done iterating over new_blocks loaded from include file
included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node2
13531 1726882413.59520: extending task lists for all hosts with included blocks
13531 1726882413.59570: done extending task lists
13531 1726882413.59572: done processing included files
13531 1726882413.59572: results queue empty
13531 1726882413.59573: checking for any_errors_fatal
13531 1726882413.59575: done checking for any_errors_fatal
13531 1726882413.59576: checking for max_fail_percentage
13531 1726882413.59577: done checking for max_fail_percentage
13531 1726882413.59578: checking to see if all hosts have failed and the running result is not ok
13531 1726882413.59578: done checking to see if all hosts have failed
13531 1726882413.59579: getting the remaining hosts for this loop
13531 1726882413.59580: done getting the remaining hosts for this loop
13531 1726882413.59583: getting the next task for host managed_node2
13531 1726882413.59586: done getting next task for host managed_node2
13531 1726882413.59588: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test
13531 1726882413.59591: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13531 1726882413.59593: getting variables
13531 1726882413.59594: in VariableManager get_vars()
13531 1726882413.59602: Calling all_inventory to load vars for managed_node2
13531 1726882413.59604: Calling groups_inventory to load vars for managed_node2
13531 1726882413.59606: Calling all_plugins_inventory to load vars for managed_node2
13531 1726882413.59611: Calling all_plugins_play to load vars for managed_node2
13531 1726882413.59613: Calling groups_plugins_inventory to load vars for managed_node2
13531 1726882413.59616: Calling groups_plugins_play to load vars for managed_node2
13531 1726882413.59970: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13531 1726882413.60160: done with get_vars()
13531 1726882413.60170: done getting variables

TASK [Gather the minimum subset of ansible_facts required by the network role test] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
Friday 20 September 2024 21:33:33 -0400 (0:00:00.055) 0:00:01.501 ******
13531 1726882413.60627: entering _queue_task() for managed_node2/setup
13531 1726882413.61130: worker is 1 (out of 1 available)
13531 1726882413.61139: exiting _queue_task() for managed_node2/setup
13531 1726882413.61150: done queuing things up, now waiting for results queue to drain
13531 1726882413.61152: waiting for pending results...
13531 1726882413.61979: running TaskExecutor() for managed_node2/TASK: Gather the minimum subset of ansible_facts required by the network role test
13531 1726882413.62120: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000001cd
13531 1726882413.62256: variable 'ansible_search_path' from source: unknown
13531 1726882413.62294: variable 'ansible_search_path' from source: unknown
13531 1726882413.62342: calling self._execute()
13531 1726882413.62612: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882413.62679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882413.62693: variable 'omit' from source: magic vars
13531 1726882413.63512: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13531 1726882413.69410: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13531 1726882413.69485: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13531 1726882413.69639: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13531 1726882413.69680: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13531 1726882413.69737: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13531 1726882413.69896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13531 1726882413.70067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13531 1726882413.70099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13531 1726882413.70148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13531 1726882413.70170: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13531 1726882413.70542: variable 'ansible_facts' from source: unknown
13531 1726882413.70751: variable 'network_test_required_facts' from source: task vars
13531 1726882413.70907: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True
13531 1726882413.70920: variable 'omit' from source: magic vars
13531 1726882413.70962: variable 'omit' from source: magic vars
13531 1726882413.71005: variable 'omit' from source: magic vars
13531 1726882413.71149: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
13531 1726882413.71182: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
13531 1726882413.71204: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
13531 1726882413.71228: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13531 1726882413.71242: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13531 1726882413.71367: variable 'inventory_hostname' from source: host vars for 'managed_node2'
13531 1726882413.71377: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882413.71384: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882413.71603: Set connection var ansible_pipelining to False
13531 1726882413.71613: Set connection var ansible_timeout to 10
13531 1726882413.71622: Set connection var ansible_shell_executable to /bin/sh
13531 1726882413.71632: Set connection var ansible_module_compression to ZIP_DEFLATED
13531 1726882413.71638: Set connection var ansible_connection to ssh
13531 1726882413.71643: Set connection var ansible_shell_type to sh
13531 1726882413.71679: variable 'ansible_shell_executable' from source: unknown
13531 1726882413.71777: variable 'ansible_connection' from source: unknown
13531 1726882413.71784: variable 'ansible_module_compression' from source: unknown
13531 1726882413.71790: variable 'ansible_shell_type' from source: unknown
13531 1726882413.71797: variable 'ansible_shell_executable' from source: unknown
13531 1726882413.71802: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882413.71810: variable 'ansible_pipelining' from source: unknown
13531 1726882413.71816: variable 'ansible_timeout' from source: unknown
13531 1726882413.71823: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882413.72079: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__)
13531 1726882413.72208: variable 'omit' from source: magic vars
13531 1726882413.72220: starting attempt loop
13531 1726882413.72226: running the handler
13531 1726882413.72244: _low_level_execute_command(): starting
13531 1726882413.72255: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
13531 1726882413.74325: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
13531 1726882413.74329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
13531 1726882413.74357: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<<
13531 1726882413.74360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
13531 1726882413.74365: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13531 1726882413.74554: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
13531 1726882413.74557: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
13531 1726882413.74559: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
13531 1726882413.74744: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13531 1726882413.76328: stdout chunk (state=3): >>>/root <<<
13531 1726882413.76426: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13531 1726882413.76510: stderr chunk (state=3): >>><<<
13531 1726882413.76514: stdout chunk (state=3): >>><<<
13531 1726882413.76625: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
13531 1726882413.76637: _low_level_execute_command(): starting
13531 1726882413.76640: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882413.7653346-13645-159746643397705 `" && echo ansible-tmp-1726882413.7653346-13645-159746643397705="` echo /root/.ansible/tmp/ansible-tmp-1726882413.7653346-13645-159746643397705 `" ) && sleep 0'
13531 1726882413.78159: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
13531 1726882413.78162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
13531 1726882413.78199: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<<
13531 1726882413.78203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
13531 1726882413.78206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13531 1726882413.78392: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
13531 1726882413.78395: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
13531 1726882413.78574: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13531 1726882413.80369: stdout chunk (state=3): >>>ansible-tmp-1726882413.7653346-13645-159746643397705=/root/.ansible/tmp/ansible-tmp-1726882413.7653346-13645-159746643397705 <<<
13531 1726882413.80474: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13531 1726882413.80548: stderr chunk (state=3): >>><<<
13531 1726882413.80551: stdout chunk (state=3): >>><<<
13531 1726882413.80870: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882413.7653346-13645-159746643397705=/root/.ansible/tmp/ansible-tmp-1726882413.7653346-13645-159746643397705 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
13531 1726882413.80874: variable 'ansible_module_compression' from source: unknown
13531 1726882413.80877: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13531i_snf1g8/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED
13531 1726882413.80879: variable 'ansible_facts' from source: unknown
13531 1726882413.80911: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882413.7653346-13645-159746643397705/AnsiballZ_setup.py
13531 1726882413.81568: Sending initial data
13531 1726882413.81572: Sent initial data (154 bytes)
13531 1726882413.84050: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
13531 1726882413.84069: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
13531 1726882413.84084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
13531 1726882413.84102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
13531 1726882413.84150: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<<
13531 1726882413.84165: stderr chunk (state=3): >>>debug2: match not found <<<
13531 1726882413.84180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13531 1726882413.84197: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
13531 1726882413.84210: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<<
13531 1726882413.84221: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
13531 1726882413.84237: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
13531 1726882413.84358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
13531 1726882413.84379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
13531 1726882413.84392: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<<
13531 1726882413.84404: stderr chunk (state=3): >>>debug2: match found <<<
13531 1726882413.84418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13531 1726882413.84499: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
13531 1726882413.84523: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
13531 1726882413.84545: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
13531 1726882413.84686: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13531 1726882413.86495: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<<
13531 1726882413.86593: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<<
13531 1726882413.86694: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13531i_snf1g8/tmppo2aajyr /root/.ansible/tmp/ansible-tmp-1726882413.7653346-13645-159746643397705/AnsiballZ_setup.py <<<
13531 1726882413.86791: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<<
13531 1726882413.90030: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13531 1726882413.90165: stderr chunk (state=3): >>><<<
13531 1726882413.90169: stdout chunk (state=3): >>><<<
13531 1726882413.90172: done transferring module to remote
13531 1726882413.90174: _low_level_execute_command(): starting
13531 1726882413.90177: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882413.7653346-13645-159746643397705/ /root/.ansible/tmp/ansible-tmp-1726882413.7653346-13645-159746643397705/AnsiballZ_setup.py && sleep 0'
13531 1726882413.91609: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
13531 1726882413.91701: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
13531 1726882413.91717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
13531 1726882413.91735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
13531 1726882413.91780: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<<
13531 1726882413.91794: stderr chunk (state=3): >>>debug2: match not found <<<
13531 1726882413.91809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13531 1726882413.91826: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
13531 1726882413.91837: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<<
13531 1726882413.91847: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
13531 1726882413.91859: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
13531 1726882413.91876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
13531 1726882413.91891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
13531 1726882413.91917: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<<
13531 1726882413.91928: stderr chunk (state=3): >>>debug2: match found <<<
13531 1726882413.91942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13531 1726882413.92022: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
13531 1726882413.92149: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
13531 1726882413.92169: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
13531 1726882413.92299: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13531 1726882413.94192: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13531 1726882413.94195: stdout chunk (state=3): >>><<<
13531 1726882413.94198: stderr chunk (state=3): >>><<<
13531 1726882413.94290: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
13531 1726882413.94293: _low_level_execute_command(): starting
13531 1726882413.94296: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882413.7653346-13645-159746643397705/AnsiballZ_setup.py && sleep 0'
13531 1726882413.95723: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
13531 1726882413.95849: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
13531 1726882413.95863: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
13531 1726882413.95881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
13531 1726882413.95920: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<<
13531 1726882413.95927: stderr chunk (state=3): >>>debug2: match not found <<<
13531 1726882413.95937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13531 1726882413.95958: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
13531 1726882413.95968: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<<
13531 1726882413.95975: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
13531 1726882413.95983: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
13531 1726882413.95992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
13531 1726882413.96003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
13531 1726882413.96010: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<<
13531 1726882413.96018: stderr chunk (state=3): >>>debug2: match found <<<
13531 1726882413.96026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13531 1726882413.96106: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
13531 1726882413.96175: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
13531 1726882413.96187: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
13531 1726882413.96380: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13531 1726882413.98309: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<<
13531 1726882413.98376: stdout chunk (state=3): >>>import '_io' # <<<
13531 1726882413.98379: stdout chunk (state=3): >>>import 'marshal' # <<<
13531 1726882413.98401: stdout chunk (state=3): >>>import 'posix' # <<<
13531 1726882413.98436: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<<
13531 1726882413.98481: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<<
13531 1726882413.98541: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<<
13531 1726882413.98569: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<<
13531 1726882413.98582: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # <<<
13531 1726882413.98604: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef7b1edc0> <<<
13531 1726882413.98641: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<<
13531 1726882413.98679: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef7ac33a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef7b1eb20> <<<
13531 1726882413.98704: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef7b1eac0> <<<
13531 1726882413.98738: stdout chunk (state=3): >>>import '_signal' # <<<
13531 1726882413.98758: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef7ac3490> <<<
13531 1726882413.98819: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<<
13531 1726882413.98822: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<<
13531 1726882413.98843: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef7ac3940> <<<
13531 1726882413.98854: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef7ac3670> <<<
13531 1726882413.98886: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<<
13531 1726882413.98913: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<<
13531 1726882413.98929: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<<
13531 1726882413.98962: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<<
13531 1726882413.98966: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<<
13531 1726882413.99007: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<<
13531 1726882413.99011: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef7a7a190> <<<
13531 1726882413.99037: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<<
13531 1726882413.99040: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<<
13531 1726882413.99123: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef7a7a220> <<<
13531 1726882413.99168: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<<
13531 1726882413.99183: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef7a9d850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef7a7a940> <<<
13531 1726882413.99205: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef7adb880> <<<
13531 1726882413.99239: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef7a73d90> <<<
13531 1726882413.99310: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' <<<
13531 1726882413.99313: stdout chunk (state=3): >>>import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef7a9dd90> <<<
13531 1726882413.99361: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef7ac3970> <<<
13531 1726882413.99391: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. <<<
13531 1726882413.99728: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<<
13531 1726882413.99731: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<<
13531 1726882413.99769: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py <<<
13531 1726882413.99789: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<<
13531 1726882413.99818: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<<
13531 1726882413.99839: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py <<<
13531 1726882413.99842: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef77d3eb0> <<<
13531 1726882413.99893: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef77d5f40> <<<
13531 1726882413.99931: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<<
13531 1726882413.99943: stdout chunk (state=3): >>>import '_sre' # <<<
13531 1726882413.99979: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<<
13531 1726882414.00003: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<<
13531 1726882414.00007: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<<
13531 1726882414.00049: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef77cc610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef77d2640> <<<
13531 1726882414.00053: stdout chunk (state=3): >>>import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef77d3370> <<<
13531 1726882414.00068: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<<
13531 1726882414.00142: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<<
13531 1726882414.00154: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<<
13531 1726882414.00199: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<<
13531 1726882414.00229: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py <<<
13531 1726882414.00232: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<<
13531 1726882414.00275: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef768ee20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef768e910> import 'itertools' # <<<
13531 1726882414.00309:
stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef768ef10> <<< 13531 1726882414.00313: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 13531 1726882414.00335: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 13531 1726882414.00369: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef768efd0> <<< 13531 1726882414.00391: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef76a10d0> import '_collections' # <<< 13531 1726882414.00449: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef77aed90> <<< 13531 1726882414.00452: stdout chunk (state=3): >>>import '_functools' # <<< 13531 1726882414.00478: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef77a7670> <<< 13531 1726882414.00546: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef77b96d0> <<< 13531 1726882414.00550: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef77dae20> <<< 13531 1726882414.00562: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 13531 1726882414.00596: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef76a1cd0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef77ae2b0> <<< 13531 1726882414.00652: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' <<< 13531 1726882414.00667: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef77b92e0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef77e09d0> <<< 13531 1726882414.00703: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 13531 1726882414.00720: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 13531 1726882414.00746: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef76a1eb0> import 'importlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f9ef76a1df0> <<< 13531 1726882414.00775: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef76a1d60> <<< 13531 1726882414.00804: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 13531 1726882414.00839: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py <<< 13531 1726882414.00842: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 13531 1726882414.00862: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 13531 1726882414.00909: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 13531 1726882414.00948: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef76743d0> <<< 13531 1726882414.00969: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 13531 1726882414.00972: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 13531 1726882414.01004: stdout chunk (state=3): >>>import 'contextlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f9ef76744c0> <<< 13531 1726882414.01127: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef76a9f40> <<< 13531 1726882414.01171: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef76a3a90> <<< 13531 1726882414.01180: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef76a3490> <<< 13531 1726882414.01203: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 13531 1726882414.01255: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 13531 1726882414.01258: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 13531 1726882414.01300: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef75c2220> <<< 13531 1726882414.01325: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef765f520> <<< 13531 1726882414.01384: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef76a3f10> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef77e0040> <<< 13531 1726882414.01416: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 13531 1726882414.01440: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 13531 1726882414.01470: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef75d4b50> import 'errno' # <<< 13531 1726882414.01512: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef75d4e80> <<< 13531 1726882414.01545: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 13531 1726882414.01566: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef75e5790> <<< 13531 1726882414.01593: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 13531 1726882414.01624: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 13531 1726882414.01652: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef75e5cd0> <<< 13531 1726882414.01710: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from 
'/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef757e400> <<< 13531 1726882414.01713: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef75d4f70> <<< 13531 1726882414.01728: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 13531 1726882414.01788: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef758f2e0> <<< 13531 1726882414.01791: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef75e5610> import 'pwd' # <<< 13531 1726882414.01813: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef758f3a0> <<< 13531 1726882414.01859: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef76a1a30> <<< 13531 1726882414.01884: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 13531 1726882414.01910: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py <<< 13531 1726882414.01923: stdout chunk (state=3): >>># code 
object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 13531 1726882414.01952: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef75aa700> <<< 13531 1726882414.01980: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 13531 1726882414.02013: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef75aa9d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef75aa7c0> <<< 13531 1726882414.02034: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef75aa8b0> <<< 13531 1726882414.02060: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 13531 1726882414.02261: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from 
'/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef75aad00> <<< 13531 1726882414.02299: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef75b5250> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef75aa940> <<< 13531 1726882414.02321: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef759ea90> <<< 13531 1726882414.02338: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef76a1610> <<< 13531 1726882414.02361: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 13531 1726882414.02417: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 13531 1726882414.02457: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef75aaaf0> <<< 13531 1726882414.02604: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 13531 1726882414.02618: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f9ef6fe76d0> <<< 13531 1726882414.02841: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip' # zipimport: zlib available <<< 13531 1726882414.02940: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.02986: stdout chunk (state=3): >>>import ansible # loaded from Zip 
/tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/__init__.py # zipimport: zlib available <<< 13531 1726882414.03015: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/__init__.py <<< 13531 1726882414.03018: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.04205: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.05125: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6f23820> <<< 13531 1726882414.05161: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py <<< 13531 1726882414.05214: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py <<< 13531 1726882414.05218: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 13531 1726882414.05230: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef6f23160> <<< 13531 1726882414.05265: stdout chunk (state=3): 
>>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6f23280> <<< 13531 1726882414.05290: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6f23f70> <<< 13531 1726882414.05315: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 13531 1726882414.05370: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6f234f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6f23d90> import 'atexit' # <<< 13531 1726882414.05409: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef6f23fd0> <<< 13531 1726882414.05412: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 13531 1726882414.05439: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 13531 1726882414.05487: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6f23100> <<< 13531 1726882414.05506: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 13531 1726882414.05535: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 13531 1726882414.05548: stdout chunk (state=3): >>># code 
object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 13531 1726882414.05577: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 13531 1726882414.05658: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6efa0d0> <<< 13531 1726882414.05709: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef6dff310> <<< 13531 1726882414.05739: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef6dff160> <<< 13531 1726882414.05752: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 13531 1726882414.05788: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6dffca0> <<< 13531 1726882414.05804: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6f0adc0> <<< 13531 1726882414.05972: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6f0a3a0> <<< 13531 1726882414.05995: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 13531 1726882414.06042: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6f0afd0> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py <<< 13531 1726882414.06053: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 13531 1726882414.06083: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 13531 1726882414.06152: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 13531 1726882414.06155: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6f5ad30> <<< 13531 1726882414.06277: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6ef8d30> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6ef8400> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6ed8b20> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef6ef8520> # 
/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6ef8550> <<< 13531 1726882414.06300: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 13531 1726882414.06581: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 13531 1726882414.06596: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef6e6afd0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6f6c250> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef6e67850> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6f6c3d0> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches 
/usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # <<< 13531 1726882414.06646: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6f6cca0> <<< 13531 1726882414.06783: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6e677f0> <<< 13531 1726882414.06871: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef6f04c10> <<< 13531 1726882414.06903: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef6f6cfa0> <<< 13531 1726882414.06937: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef6f6c550> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6f65910> <<< 13531 1726882414.06977: 
stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 13531 1726882414.07004: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 13531 1726882414.07041: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef6e5d940> <<< 13531 1726882414.07253: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef6e7ad90> <<< 13531 1726882414.07286: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6e66580> <<< 13531 1726882414.07306: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef6e5dee0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6e669a0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # 
loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available <<< 13531 1726882414.07386: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.07490: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/common/__init__.py <<< 13531 1726882414.07510: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available <<< 13531 1726882414.07612: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.07704: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.08161: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.08625: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/six/__init__.py <<< 13531 1726882414.08656: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/common/text/converters.py <<< 13531 1726882414.08666: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 13531 1726882414.08706: stdout chunk (state=3): >>># extension module '_ctypes' loaded from 
'/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef6e797f0> <<< 13531 1726882414.08796: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6eb48b0> <<< 13531 1726882414.08800: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef69ed970> <<< 13531 1726882414.08844: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/compat/selinux.py <<< 13531 1726882414.08874: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.08888: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.08891: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available <<< 13531 1726882414.09004: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.09134: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 13531 1726882414.09161: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6ee0730> <<< 13531 1726882414.09166: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.09543: stdout chunk (state=3): >>># zipimport: zlib available 
<<< 13531 1726882414.09904: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.09954: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.10022: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available <<< 13531 1726882414.10057: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.10098: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/common/warnings.py <<< 13531 1726882414.10103: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.10146: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.10253: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/parsing/__init__.py <<< 13531 1726882414.10653: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available <<< 13531 1726882414.10700: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 13531 1726882414.10733: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # <<< 13531 1726882414.10811: stdout chunk (state=3): >>>import 'ast' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6f25370> <<< 13531 1726882414.10814: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.10871: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.10953: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/common/parameters.py <<< 13531 1726882414.10957: stdout chunk (state=3): >>>import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/common/arg_spec.py <<< 13531 1726882414.10984: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.10997: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.11041: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/common/locale.py <<< 13531 1726882414.11044: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.11084: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.11113: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.11203: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.11263: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 13531 1726882414.11286: stdout chunk (state=3): >>># 
code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 13531 1726882414.11354: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef6e97550> <<< 13531 1726882414.11447: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef687f160> <<< 13531 1726882414.11484: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available <<< 13531 1726882414.11541: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.11592: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.11622: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.11650: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 13531 1726882414.11679: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 13531 1726882414.11729: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 13531 1726882414.11757: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches 
/usr/lib64/python3.9/gettext.py <<< 13531 1726882414.11771: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 13531 1726882414.11834: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6e9a910> <<< 13531 1726882414.11879: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6e9b790> <<< 13531 1726882414.11932: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6e97b50> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available <<< 13531 1726882414.11983: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.11995: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/common/sys_info.py <<< 13531 1726882414.12055: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/basic.py <<< 13531 1726882414.12098: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available <<< 13531 1726882414.12150: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.12215: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.12234: stdout chunk (state=3): >>># zipimport: zlib available 
<<< 13531 1726882414.12245: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.12278: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.12316: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.12341: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.12380: stdout chunk (state=3): >>>import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/namespace.py <<< 13531 1726882414.12392: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.12441: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.12523: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.12553: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.12567: stdout chunk (state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available <<< 13531 1726882414.12711: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.12849: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.12882: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.12932: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' <<< 13531 1726882414.12975: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py <<< 13531 1726882414.12980: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' <<< 13531 
1726882414.12982: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' <<< 13531 1726882414.13009: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef69b2370> <<< 13531 1726882414.13030: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py <<< 13531 1726882414.13036: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' <<< 13531 1726882414.13058: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py <<< 13531 1726882414.13095: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' <<< 13531 1726882414.13104: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py <<< 13531 1726882414.13117: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' <<< 13531 1726882414.13124: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef69cb580> <<< 13531 1726882414.13169: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' <<< 13531 1726882414.13172: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef69cb4f0> <<< 13531 1726882414.13234: stdout chunk (state=3): 
>>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef69a3280> <<< 13531 1726882414.13240: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef69b2970> <<< 13531 1726882414.13269: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef67697f0> <<< 13531 1726882414.13281: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6769b20> <<< 13531 1726882414.13287: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py <<< 13531 1726882414.13317: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' <<< 13531 1726882414.13329: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py <<< 13531 1726882414.13334: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' <<< 13531 1726882414.13374: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' <<< 13531 1726882414.13377: stdout chunk (state=3): >>># extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef6a130a0> <<< 13531 1726882414.13380: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef69b90a0> <<< 13531 1726882414.13402: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py <<< 13531 1726882414.13412: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' <<< 13531 1726882414.13442: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6a13190> <<< 13531 1726882414.13451: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py <<< 13531 1726882414.13477: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' <<< 13531 1726882414.13501: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' <<< 13531 1726882414.13507: stdout chunk (state=3): >>># extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef67d2fd0> <<< 13531 1726882414.13528: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef69fd820> <<< 13531 1726882414.13567: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6769d60> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/timeout.py <<< 13531 1726882414.13571: stdout chunk (state=3): >>>import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py <<< 13531 1726882414.13587: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.13590: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip 
/tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/other/__init__.py <<< 13531 1726882414.13612: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.13666: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.13715: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available <<< 13531 1726882414.13766: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.13808: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/other/ohai.py <<< 13531 1726882414.13820: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.13823: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.13832: stdout chunk (state=3): >>>import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/system/__init__.py <<< 13531 1726882414.13838: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.13883: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.13895: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/system/apparmor.py <<< 13531 1726882414.13900: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.13941: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.13986: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.caps # loaded from Zip 
/tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/system/caps.py <<< 13531 1726882414.13992: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.14029: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.14066: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/system/chroot.py <<< 13531 1726882414.14075: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.14124: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.14179: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.14217: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.14275: stdout chunk (state=3): >>>import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/system/cmdline.py <<< 13531 1726882414.14281: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.14673: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.15024: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available <<< 13531 1726882414.15076: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.15117: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.15147: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.15176: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # 
loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/compat/datetime.py <<< 13531 1726882414.15185: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available <<< 13531 1726882414.15214: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.15238: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/system/env.py <<< 13531 1726882414.15244: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.15295: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.15343: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/system/dns.py <<< 13531 1726882414.15349: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.15382: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.15406: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/system/fips.py <<< 13531 1726882414.15409: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.15433: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.15468: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/system/loadavg.py <<< 13531 1726882414.15475: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.15529: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 13531 1726882414.15609: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' <<< 13531 1726882414.15639: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef66b5e80> <<< 13531 1726882414.15645: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py <<< 13531 1726882414.15678: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' <<< 13531 1726882414.15827: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef66b59d0> <<< 13531 1726882414.15833: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available <<< 13531 1726882414.15899: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.15950: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/system/lsb.py <<< 13531 1726882414.15961: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.16028: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.16108: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available <<< 13531 1726882414.16171: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.16231: stdout chunk (state=3): >>>import 
ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/system/platform.py <<< 13531 1726882414.16237: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.16273: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.16315: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py <<< 13531 1726882414.16333: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' <<< 13531 1726882414.16481: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' <<< 13531 1726882414.16487: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef6732490> <<< 13531 1726882414.16724: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef66b7850> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/system/python.py <<< 13531 1726882414.16730: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.16774: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.16827: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available <<< 13531 1726882414.16901: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.16970: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.17068: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 13531 1726882414.17191: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/compat/version.py <<< 13531 1726882414.17198: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available <<< 13531 1726882414.17234: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.17268: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py <<< 13531 1726882414.17275: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.17303: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.17349: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' <<< 13531 1726882414.17396: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef6730670> <<< 13531 1726882414.17418: stdout chunk (state=3): >>>import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6730220> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available <<< 13531 1726882414.17421: stdout chunk (state=3): >>># 
zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py <<< 13531 1726882414.17440: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.17475: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.17512: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/base.py <<< 13531 1726882414.17518: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.17651: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.17778: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/aix.py <<< 13531 1726882414.17784: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.17864: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.17942: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.17978: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.18015: stdout chunk (state=3): >>>import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py <<< 13531 1726882414.18028: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.18102: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.18125: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.18237: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 13531 1726882414.18370: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py <<< 13531 1726882414.18377: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.18463: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.18569: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available <<< 13531 1726882414.18598: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.18628: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.19053: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.19473: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/linux.py <<< 13531 1726882414.19476: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available <<< 13531 1726882414.19555: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.19651: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py <<< 13531 1726882414.19654: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 13531 1726882414.19731: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.19814: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available <<< 13531 1726882414.19948: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.20093: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available <<< 13531 1726882414.20107: stdout chunk (state=3): >>>import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available <<< 13531 1726882414.20140: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.20195: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/network/base.py <<< 13531 1726882414.20199: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.20271: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.20350: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.20520: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.20698: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip 
/tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/network/aix.py <<< 13531 1726882414.20701: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.20728: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.20766: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/network/darwin.py <<< 13531 1726882414.20777: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.20794: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.20816: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available <<< 13531 1726882414.20875: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.20953: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available <<< 13531 1726882414.20981: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.20993: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available <<< 13531 1726882414.21035: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.21092: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available <<< 13531 1726882414.21142: stdout chunk (state=3): 
>>># zipimport: zlib available <<< 13531 1726882414.21202: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/network/hurd.py <<< 13531 1726882414.21205: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.21413: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.21629: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available <<< 13531 1726882414.21685: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.21742: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/network/iscsi.py <<< 13531 1726882414.21745: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.21768: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.21804: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available <<< 13531 1726882414.21839: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.21867: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/network/netbsd.py <<< 13531 1726882414.21886: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.21898: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.21939: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.openbsd # loaded 
from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available <<< 13531 1726882414.22003: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.22085: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/network/sunos.py <<< 13531 1726882414.22107: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available <<< 13531 1726882414.22151: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.22200: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/base.py <<< 13531 1726882414.22222: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.22226: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.22237: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.22275: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.22317: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.22371: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.22442: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import 
ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py <<< 13531 1726882414.22448: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.22492: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.22539: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py <<< 13531 1726882414.22546: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.22706: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.22867: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available <<< 13531 1726882414.22910: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.22950: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py <<< 13531 1726882414.22956: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.22994: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.23041: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py <<< 13531 1726882414.23048: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.23109: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.23187: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sunos # loaded from Zip 
/tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/default_collectors.py <<< 13531 1726882414.23193: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.23268: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.23337: stdout chunk (state=3): >>>import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/ansible_collector.py <<< 13531 1726882414.23344: stdout chunk (state=3): >>>import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/__init__.py <<< 13531 1726882414.23417: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.24157: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py <<< 13531 1726882414.24166: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' <<< 13531 1726882414.24179: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py <<< 13531 1726882414.24195: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 13531 1726882414.24235: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from 
'/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef6674100> <<< 13531 1726882414.24242: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef66c89a0> <<< 13531 1726882414.24300: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef66c8ee0> <<< 13531 1726882414.25547: stdout chunk (state=3): >>>import 'gc' # <<< 13531 1726882414.25937: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALEARW5ZJ51XTLSDuUsPojumVU0f1DmiQsXjMOap4QLlljOiysapjSUe6pZOyAdiI/KfARhDoOFvlC07kCLCcs7DDk8JxBZpsM0D55SdDlfwsB3FVgWNP+9by8G6kzbePHWdZyyWlAuavj4OAEwAjpWpP8/daus0ha4xywlVVoKjAAAAFQCbiW4bR+tgMvjrxC198dqI1mTbjQAAAIBzCzkJTtnGDKfOHq2dFI5cUEuaj1PgRot3wyaXENzUjZVnIFgXUmgKDCxO+EAtU6uAkBPQF4XNgiuaw5bavYpZxcJ4WIpM4ZDRoSkc7BBbJPRLZ45GfrHJwgqAmAZ3RSvVqeXE4WKQHLm43/eDHewgPqqqWe6QVuQH5SEe79yk3wAAAIEArG+AuupiAeoVJ9Lh36QMj4kRo5pTASh2eD5MqSOdy39UhsXbWBcj3JCIvNk/nwep/9neGyRZ5t5wT05dRX80vlgZJX65hrbepO+lqC3wlng+6GQ34D7TJKYnvEkR3neE0+06kx5R6IRWZf1YQV6fMQhx8AJ2JmvnLFicmYlkhQQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDND+RJCrYgIUzolo5fZ64Ey6cksefKDUWmGDjsqVTmuT3HrlDyUZOro4JAnUQBmiamXsJUFbrFdJAVpukD4yyowqCQLr0ZFuKNEzrt5CObrtWflOskKynO3kaoU0WhDkqIbwS2j/+NxBCxgDGqd/5Os3cOMv3eyjUElz6xoI4zsmGMfxVYmT+/SHBfoyxyqY8Hw2Ooq+H5L9OlYgV4hqu7kKPpM1THUJTjy47m6qvws5gztclLjPA1KIW2Dz6kKzUYspNJcoS2sK1xFvL7mBjpGAP7WhXVH2n5ySenQ24Z6mEj+tG2f11rjPpjCUjDzzciGCWiRDZWBLm/GGmQXJJ8zAYnw82yIUKqufLrr1wmcXICPMVj9pFjXSoBWe/yhX9E87w7YD5HWsUrgrLdSctdV4QYy+R5g9ERi7FjwbRsuZ04BihZs70+f/29hUzuc6MA87KVovGT0Uc7GVC7bx8NLt0bTBsbydlONVHVQuol/YEpQrQophDvmBfh+PgMDH8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOEITn1vyppR+Moe1UdR0WGPhUnQ/dwHNcNi0OYy21LkBQ5jsxOPLvZ+C2MbRYlz2afs4nYYIV8E0AuK6aRks3w=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKEdFOHVk9tX1R+zEyLVdxS/U5QeeeFYWSnUmjpXlpt7", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_fips": false, "ansible_pkg_mgr": "dnf", "ansible_lsb": {}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-158", "ansible_nodename": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "21e18164a0c64d0daed004bd8a1b67b7", "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": 
"(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 33528 10.31.11.158 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LES<<< 13531 1726882414.25947: stdout chunk (state=3): >>>SOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 33528 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_service_mgr": "systemd", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, 
"ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "33", "second": "34", "epoch": "1726882414", "epoch_int": "1726882414", "date": "2024-09-20", "time": "21:33:34", "iso8601_micro": "2024-09-21T01:33:34.256750Z", "iso8601": "2024-09-21T01:33:34Z", "iso8601_basic": "20240920T213334256750", "iso8601_basic_short": "20240920T213334", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 13531 1726882414.26458: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal<<< 13531 1726882414.26548: stdout chunk (state=3): >>> # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] 
removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator<<< 13531 1726882414.26608: stdout chunk (state=3): >>> # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # 
cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib <<< 13531 1726882414.26667: stdout chunk (state=3): >>># cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing 
_socket # cleanup[2] removing array <<< 13531 1726882414.26729: stdout chunk (state=3): >>># cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast <<< 13531 1726882414.26806: stdout chunk (state=3): >>># destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing 
ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # 
cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai <<< 13531 1726882414.26830: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing 
ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin <<< 13531 1726882414.26843: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd <<< 13531 1726882414.26916: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] 
removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos <<< 13531 1726882414.26947: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # 
destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl <<< 13531 1726882414.26960: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy 
ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc <<< 13531 1726882414.27241: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 13531 1726882414.27259: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 13531 1726882414.27290: stdout chunk (state=3): >>># destroy zipimport <<< 13531 1726882414.27302: stdout chunk (state=3): >>># destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma <<< 13531 1726882414.27349: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json <<< 13531 1726882414.27358: stdout chunk (state=3): >>># destroy encodings <<< 13531 1726882414.27373: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 13531 1726882414.27417: stdout chunk (state=3): >>># destroy selinux # destroy distro # destroy logging # destroy argparse <<< 13531 1726882414.27467: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool <<< 13531 1726882414.27484: stdout chunk (state=3): >>># destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle <<< 13531 1726882414.27500: stdout chunk (state=3): >>># destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction <<< 13531 1726882414.27521: stdout chunk (state=3): >>># destroy shlex # destroy datetime <<< 13531 
1726882414.27537: stdout chunk (state=3): >>># destroy base64 <<< 13531 1726882414.27562: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json <<< 13531 1726882414.27596: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob <<< 13531 1726882414.27603: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 13531 1726882414.27640: stdout chunk (state=3): >>># cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep <<< 13531 1726882414.27685: stdout chunk (state=3): >>># cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux <<< 13531 1726882414.27712: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math <<< 13531 1726882414.27750: stdout chunk (state=3): >>># cleanup[3] wiping 
shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external <<< 13531 1726882414.27796: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re <<< 13531 1726882414.27811: stdout chunk (state=3): >>># destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath <<< 13531 1726882414.27851: stdout chunk (state=3): >>># cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 13531 1726882414.27865: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 13531 
1726882414.27904: stdout chunk (state=3): >>># destroy gc # destroy unicodedata # destroy termios <<< 13531 1726882414.27912: stdout chunk (state=3): >>># destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 13531 1726882414.28084: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize <<< 13531 1726882414.28117: stdout chunk (state=3): >>># destroy _heapq # destroy posixpath # destroy stat <<< 13531 1726882414.28143: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser <<< 13531 1726882414.28158: stdout chunk (state=3): >>># destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator <<< 13531 1726882414.28166: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 13531 1726882414.28209: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 13531 1726882414.28577: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 13531 1726882414.28623: stderr chunk (state=3): >>><<< 13531 1726882414.28626: stdout chunk (state=3): >>><<< 13531 1726882414.28770: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef7b1edc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef7ac33a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef7b1eb20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef7b1eac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef7ac3490> # 
/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef7ac3940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef7ac3670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef7a7a190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef7a7a220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef7a9d850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef7a7a940> import 'os' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f9ef7adb880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef7a73d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef7a9dd90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef7ac3970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef77d3eb0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef77d5f40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # 
/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef77cc610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef77d2640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef77d3370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef768ee20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef768e910> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef768ef10> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches 
/usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef768efd0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef76a10d0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef77aed90> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef77a7670> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef77b96d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef77dae20> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef76a1cd0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef77ae2b0> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef77b92e0> import 
'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef77e09d0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef76a1eb0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef76a1df0> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef76a1d60> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef76743d0> # 
/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef76744c0> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef76a9f40> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef76a3a90> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef76a3490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef75c2220> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef765f520> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef76a3f10> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef77e0040> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef75d4b50> import 'errno' # # extension module 'zlib' loaded from 
'/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef75d4e80> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef75e5790> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef75e5cd0> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef757e400> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef75d4f70> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef758f2e0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef75e5610> import 'pwd' # # extension module 'grp' 
loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef758f3a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef76a1a30> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef75aa700> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef75aa9d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef75aa7c0> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef75aa8b0> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc 
matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef75aad00> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef75b5250> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef75aa940> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef759ea90> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef76a1610> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef75aaaf0> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f9ef6fe76d0> # zipimport: found 103 names in '/tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6f23820> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef6f23160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6f23280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6f23f70> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6f234f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6f23d90> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef6f23fd0> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6f23100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6efa0d0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef6dff310> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef6dff160> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6dffca0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f9ef6f0adc0> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6f0a3a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6f0afd0> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6f5ad30> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6ef8d30> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6ef8400> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6ed8b20> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef6ef8520> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' 
import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6ef8550> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef6e6afd0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6f6c250> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef6e67850> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6f6c3d0> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6f6cca0> import 
'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6e677f0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef6f04c10> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef6f6cfa0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef6f6c550> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6f65910> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object 
at 0x7f9ef6e5d940> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef6e7ad90> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6e66580> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef6e5dee0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6e669a0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters 
# loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef6e797f0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6eb48b0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef69ed970> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6ee0730> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: 
zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6f25370> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef6e97550> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef687f160> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6e9a910> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6e9b790> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6e97b50> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef69b2370> # /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef69cb580> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef69cb4f0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef69a3280> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef69b2970> import 
'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef67697f0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6769b20> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef6a130a0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef69b90a0> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6a13190> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef67d2fd0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f9ef69fd820> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6769d60> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip 
/tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef66b5e80> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef66b59d0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from 
'/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef6732490> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef66b7850> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef6730670> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef6730220> import 
ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # 
zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip 
/tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available 
import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip 
/tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_setup_payload_zb1mpxh0/ansible_setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: 
zlib available # /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9ef6674100> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef66c89a0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9ef66c8ee0> import 'gc' # {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALEARW5ZJ51XTLSDuUsPojumVU0f1DmiQsXjMOap4QLlljOiysapjSUe6pZOyAdiI/KfARhDoOFvlC07kCLCcs7DDk8JxBZpsM0D55SdDlfwsB3FVgWNP+9by8G6kzbePHWdZyyWlAuavj4OAEwAjpWpP8/daus0ha4xywlVVoKjAAAAFQCbiW4bR+tgMvjrxC198dqI1mTbjQAAAIBzCzkJTtnGDKfOHq2dFI5cUEuaj1PgRot3wyaXENzUjZVnIFgXUmgKDCxO+EAtU6uAkBPQF4XNgiuaw5bavYpZxcJ4WIpM4ZDRoSkc7BBbJPRLZ45GfrHJwgqAmAZ3RSvVqeXE4WKQHLm43/eDHewgPqqqWe6QVuQH5SEe79yk3wAAAIEArG+AuupiAeoVJ9Lh36QMj4kRo5pTASh2eD5MqSOdy39UhsXbWBcj3JCIvNk/nwep/9neGyRZ5t5wT05dRX80vlgZJX65hrbepO+lqC3wlng+6GQ34D7TJKYnvEkR3neE0+06kx5R6IRWZf1YQV6fMQhx8AJ2JmvnLFicmYlkhQQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", 
"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDND+RJCrYgIUzolo5fZ64Ey6cksefKDUWmGDjsqVTmuT3HrlDyUZOro4JAnUQBmiamXsJUFbrFdJAVpukD4yyowqCQLr0ZFuKNEzrt5CObrtWflOskKynO3kaoU0WhDkqIbwS2j/+NxBCxgDGqd/5Os3cOMv3eyjUElz6xoI4zsmGMfxVYmT+/SHBfoyxyqY8Hw2Ooq+H5L9OlYgV4hqu7kKPpM1THUJTjy47m6qvws5gztclLjPA1KIW2Dz6kKzUYspNJcoS2sK1xFvL7mBjpGAP7WhXVH2n5ySenQ24Z6mEj+tG2f11rjPpjCUjDzzciGCWiRDZWBLm/GGmQXJJ8zAYnw82yIUKqufLrr1wmcXICPMVj9pFjXSoBWe/yhX9E87w7YD5HWsUrgrLdSctdV4QYy+R5g9ERi7FjwbRsuZ04BihZs70+f/29hUzuc6MA87KVovGT0Uc7GVC7bx8NLt0bTBsbydlONVHVQuol/YEpQrQophDvmBfh+PgMDH8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOEITn1vyppR+Moe1UdR0WGPhUnQ/dwHNcNi0OYy21LkBQ5jsxOPLvZ+C2MbRYlz2afs4nYYIV8E0AuK6aRks3w=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKEdFOHVk9tX1R+zEyLVdxS/U5QeeeFYWSnUmjpXlpt7", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_fips": false, "ansible_pkg_mgr": "dnf", "ansible_lsb": {}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-158", "ansible_nodename": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "21e18164a0c64d0daed004bd8a1b67b7", "ansible_local": {}, "ansible_cmdline": 
{"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 33528 10.31.11.158 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 33528 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_service_mgr": "systemd", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": 
"/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "33", "second": "34", "epoch": "1726882414", "epoch_int": "1726882414", "date": "2024-09-20", "time": "21:33:34", "iso8601_micro": "2024-09-21T01:33:34.256750Z", "iso8601": "2024-09-21T01:33:34Z", "iso8601_basic": "20240920T213334256750", "iso8601_basic_short": "20240920T213334", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing 
__main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # 
cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common 
# destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing 
selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing 
ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing 
ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing 
ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy 
ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon 
# destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] 
wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # 
cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. [WARNING]: Module invocation had junk after the JSON data: # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # 
cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy 
zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy 
ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] 
removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing 
ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing 
ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy 
ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy 
ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # 
destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # 
cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy 
sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks
13531 1726882414.29821: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882413.7653346-13645-159746643397705/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
13531 1726882414.29824: _low_level_execute_command(): starting
13531 1726882414.29826: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882413.7653346-13645-159746643397705/ > /dev/null 2>&1 && sleep 0'
13531 1726882414.30733: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
13531 1726882414.30815: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
13531 1726882414.30826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
13531 1726882414.30840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
13531 1726882414.30884: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<<
13531 1726882414.30891: stderr chunk (state=3): >>>debug2: match not found <<<
13531 1726882414.30903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13531 1726882414.30917: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
13531 1726882414.30927: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<<
13531 1726882414.30934: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
13531 1726882414.30942: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
13531 1726882414.30951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
13531 1726882414.31042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
13531 1726882414.31050: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<<
13531 1726882414.31060: stderr chunk (state=3): >>>debug2: match found <<<
13531 1726882414.31074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13531 1726882414.31145: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
13531 1726882414.31254: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
13531 1726882414.31272: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
13531 1726882414.31399: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13531 1726882414.33280: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13531 1726882414.33284: stdout chunk (state=3): >>><<<
13531 1726882414.33290: stderr chunk (state=3): >>><<<
13531 1726882414.33310: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.11.158 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
13531 1726882414.33316: handler run complete
13531 1726882414.33366: variable 'ansible_facts' from source: unknown
13531 1726882414.33421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13531 1726882414.33539: variable 'ansible_facts' from source: unknown
13531 1726882414.33588: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13531 1726882414.33643: attempt loop complete, returning result
13531 1726882414.33646: _execute() done
13531 1726882414.33648: dumping result to json
13531 1726882414.33665: done dumping result, returning
13531 1726882414.33674: done running TaskExecutor() for managed_node2/TASK: Gather the minimum subset of ansible_facts required by the network role test [0e448fcc-3ce9-4fd9-519d-0000000001cd]
13531 1726882414.33680: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000001cd
13531 1726882414.33833: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000001cd
13531 1726882414.33835: WORKER PROCESS EXITING
ok: [managed_node2]
13531 1726882414.33954: no more pending results, returning what we have
13531 1726882414.33957: results queue empty
13531 1726882414.33957: checking for any_errors_fatal
13531 1726882414.33959: done checking for any_errors_fatal
13531 1726882414.33960: checking for max_fail_percentage
13531 1726882414.33961: done checking for max_fail_percentage
13531 1726882414.33962: checking to see if all hosts have failed and the running result is not ok
13531 1726882414.33963: done checking to see if all hosts have failed
13531 1726882414.33965: getting the remaining hosts for this loop
13531 1726882414.33966: done getting the remaining hosts for this loop
13531 1726882414.33970: getting the next task for host managed_node2
13531 1726882414.33977: done getting next task for host managed_node2
13531 1726882414.33980: ^ task is: TASK: Check if system is ostree
13531 1726882414.33982: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13531 1726882414.33985: getting variables
13531 1726882414.33986: in VariableManager get_vars()
13531 1726882414.34013: Calling all_inventory to load vars for managed_node2
13531 1726882414.34016: Calling groups_inventory to load vars for managed_node2
13531 1726882414.34019: Calling all_plugins_inventory to load vars for managed_node2
13531 1726882414.34030: Calling all_plugins_play to load vars for managed_node2
13531 1726882414.34033: Calling groups_plugins_inventory to load vars for managed_node2
13531 1726882414.34036: Calling groups_plugins_play to load vars for managed_node2
13531 1726882414.34192: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13531 1726882414.34419: done with get_vars()
13531 1726882414.34428: done getting variables

TASK [Check if system is ostree] ***********************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
Friday 20 September 2024  21:33:34 -0400 (0:00:00.739)       0:00:02.241 ******
13531 1726882414.34592: entering _queue_task() for managed_node2/stat
13531 1726882414.35096: worker is 1 (out of 1 available)
13531 1726882414.35108: exiting _queue_task() for managed_node2/stat
13531 1726882414.35120: done queuing things up, now waiting for results queue to drain
13531 1726882414.35121: waiting for pending results...
13531 1726882414.35868: running TaskExecutor() for managed_node2/TASK: Check if system is ostree 13531 1726882414.36070: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000001cf 13531 1726882414.36083: variable 'ansible_search_path' from source: unknown 13531 1726882414.36086: variable 'ansible_search_path' from source: unknown 13531 1726882414.36122: calling self._execute() 13531 1726882414.36298: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882414.36302: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882414.36312: variable 'omit' from source: magic vars 13531 1726882414.37018: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13531 1726882414.37625: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13531 1726882414.37673: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13531 1726882414.37800: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13531 1726882414.37832: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13531 1726882414.38034: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13531 1726882414.38066: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13531 1726882414.38093: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882414.38235: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13531 1726882414.38476: Evaluated conditional (not __network_is_ostree is defined): True 13531 1726882414.38484: variable 'omit' from source: magic vars 13531 1726882414.38525: variable 'omit' from source: magic vars 13531 1726882414.38681: variable 'omit' from source: magic vars 13531 1726882414.38710: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882414.38737: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882414.38755: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882414.38888: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882414.38899: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882414.38927: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882414.38930: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882414.38935: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882414.39157: Set connection var ansible_pipelining to False 13531 1726882414.39166: Set connection var ansible_timeout to 10 13531 1726882414.39172: Set connection var ansible_shell_executable to /bin/sh 13531 1726882414.39178: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882414.39180: Set connection var ansible_connection to ssh 13531 1726882414.39182: Set connection var ansible_shell_type to sh 13531 1726882414.39324: variable 'ansible_shell_executable' from source: unknown 13531 1726882414.39421: variable 'ansible_connection' from 
source: unknown 13531 1726882414.39428: variable 'ansible_module_compression' from source: unknown 13531 1726882414.39435: variable 'ansible_shell_type' from source: unknown 13531 1726882414.39442: variable 'ansible_shell_executable' from source: unknown 13531 1726882414.39449: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882414.39460: variable 'ansible_pipelining' from source: unknown 13531 1726882414.39471: variable 'ansible_timeout' from source: unknown 13531 1726882414.39480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882414.39856: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13531 1726882414.39876: variable 'omit' from source: magic vars 13531 1726882414.39888: starting attempt loop 13531 1726882414.39895: running the handler 13531 1726882414.39912: _low_level_execute_command(): starting 13531 1726882414.39926: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13531 1726882414.41685: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882414.41727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882414.41772: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882414.41775: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882414.41778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882414.41942: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882414.41957: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882414.42073: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882414.43676: stdout chunk (state=3): >>>/root <<< 13531 1726882414.43775: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882414.43866: stderr chunk (state=3): >>><<< 13531 1726882414.43870: stdout chunk (state=3): >>><<< 13531 1726882414.43989: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882414.44000: _low_level_execute_command(): starting 13531 1726882414.44003: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882414.4389226-13661-61383009584362 `" && echo ansible-tmp-1726882414.4389226-13661-61383009584362="` echo /root/.ansible/tmp/ansible-tmp-1726882414.4389226-13661-61383009584362 `" ) && sleep 0' 13531 1726882414.45507: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882414.45511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882414.45546: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 13531 1726882414.45549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882414.45554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882414.45697: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 
1726882414.45716: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882414.45719: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882414.45840: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882414.47768: stdout chunk (state=3): >>>ansible-tmp-1726882414.4389226-13661-61383009584362=/root/.ansible/tmp/ansible-tmp-1726882414.4389226-13661-61383009584362 <<< 13531 1726882414.47872: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882414.47956: stderr chunk (state=3): >>><<< 13531 1726882414.47960: stdout chunk (state=3): >>><<< 13531 1726882414.48270: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882414.4389226-13661-61383009584362=/root/.ansible/tmp/ansible-tmp-1726882414.4389226-13661-61383009584362 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 13531 1726882414.48273: variable 'ansible_module_compression' from source: unknown 13531 1726882414.48275: ANSIBALLZ: Using lock for stat 13531 1726882414.48277: ANSIBALLZ: Acquiring lock 13531 1726882414.48279: ANSIBALLZ: Lock acquired: 139969312288560 13531 1726882414.48280: ANSIBALLZ: Creating module 13531 1726882414.67101: ANSIBALLZ: Writing module into payload 13531 1726882414.67228: ANSIBALLZ: Writing module 13531 1726882414.67257: ANSIBALLZ: Renaming module 13531 1726882414.67273: ANSIBALLZ: Done creating module 13531 1726882414.67294: variable 'ansible_facts' from source: unknown 13531 1726882414.67375: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882414.4389226-13661-61383009584362/AnsiballZ_stat.py 13531 1726882414.67539: Sending initial data 13531 1726882414.67542: Sent initial data (152 bytes) 13531 1726882414.68806: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882414.68823: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882414.68839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882414.68864: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882414.68911: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882414.68925: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882414.68941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882414.68960: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882414.68976: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882414.68988: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882414.68999: 
stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882414.69011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882414.69025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882414.69035: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882414.69048: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882414.69070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882414.69149: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882414.69171: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882414.69185: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882414.69426: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882414.71166: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13531 1726882414.71260: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13531 1726882414.71365: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13531i_snf1g8/tmpcpfq4ikz 
/root/.ansible/tmp/ansible-tmp-1726882414.4389226-13661-61383009584362/AnsiballZ_stat.py <<< 13531 1726882414.71458: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13531 1726882414.72841: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882414.72968: stderr chunk (state=3): >>><<< 13531 1726882414.72975: stdout chunk (state=3): >>><<< 13531 1726882414.72978: done transferring module to remote 13531 1726882414.73069: _low_level_execute_command(): starting 13531 1726882414.73073: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882414.4389226-13661-61383009584362/ /root/.ansible/tmp/ansible-tmp-1726882414.4389226-13661-61383009584362/AnsiballZ_stat.py && sleep 0' 13531 1726882414.73724: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882414.73736: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882414.73757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882414.73776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882414.73815: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882414.73826: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882414.73838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882414.73860: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882414.73875: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882414.73885: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882414.73896: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 13531 1726882414.73907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882414.73949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882414.73975: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882414.73987: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882414.74000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882414.74071: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882414.74100: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882414.74116: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882414.74246: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882414.76136: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882414.76211: stderr chunk (state=3): >>><<< 13531 1726882414.76215: stdout chunk (state=3): >>><<< 13531 1726882414.76290: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882414.76296: _low_level_execute_command(): starting 13531 1726882414.76299: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882414.4389226-13661-61383009584362/AnsiballZ_stat.py && sleep 0' 13531 1726882414.76968: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882414.76985: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882414.77000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882414.77019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882414.77077: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882414.77091: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882414.77105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882414.77123: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882414.77134: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882414.77146: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882414.77167: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882414.77182: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 13531 1726882414.77198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882414.77210: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882414.77220: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882414.77233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882414.77317: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882414.77342: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882414.77365: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882414.77518: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882414.79504: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 13531 1726882414.79566: stdout chunk (state=3): >>>import '_io' # <<< 13531 1726882414.79569: stdout chunk (state=3): >>>import 'marshal' # <<< 13531 1726882414.79599: stdout chunk (state=3): >>>import 'posix' # <<< 13531 1726882414.79638: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 13531 1726882414.79648: stdout chunk (state=3): >>># installing zipimport hook <<< 13531 1726882414.79678: stdout chunk (state=3): >>>import 'time' # <<< 13531 1726882414.79681: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 13531 1726882414.79763: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 13531 1726882414.79782: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 13531 1726882414.79800: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # <<< 13531 1726882414.79803: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b6296cdc0> <<< 13531 1726882414.79872: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b629113a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b6296cb20> <<< 13531 1726882414.79900: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b6296cac0> <<< 13531 1726882414.79922: stdout chunk (state=3): >>>import '_signal' # <<< 13531 1726882414.79950: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 13531 1726882414.80025: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b62911490> <<< 13531 1726882414.80038: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object 
from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b62911940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b62911670> <<< 13531 1726882414.80142: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 13531 1726882414.80215: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 13531 1726882414.80219: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b628c8190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 13531 1726882414.80468: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b628c8220> <<< 13531 1726882414.80492: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b628eb850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b628c8940> import 'os' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2b62929880> <<< 13531 1726882414.80496: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b628c1d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b628ebd90> <<< 13531 1726882414.80555: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b62911970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 13531 1726882414.80774: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 13531 1726882414.80780: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 13531 1726882414.80825: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 13531 1726882414.80994: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 13531 1726882414.81070: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b62867eb0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b6286af40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 13531 1726882414.81083: stdout chunk (state=3): >>>import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b62860610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b62866640> import 'sre_compile' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2b62867370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 13531 1726882414.81236: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 13531 1726882414.81392: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b625d4e20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b625d4910> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b625d4f10> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 13531 1726882414.81522: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b625d4fd0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2b625e70d0> import '_collections' # <<< 13531 1726882414.81824: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b62842d90> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b6283b670> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b6284e6d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b6286ee20> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b625e7cd0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b628422b0> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b6284e2e0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b628749d0> <<< 13531 1726882414.81839: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # 
/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b625e7eb0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b625e7df0> <<< 13531 1726882414.81848: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b625e7d60> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 13531 1726882414.81889: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 13531 1726882414.82096: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b625ba3d0> <<< 13531 1726882414.82121: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 13531 1726882414.82154: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b625ba4c0> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b625eff40> <<< 13531 1726882414.82192: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b625e9a90> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b625e9490> <<< 13531 1726882414.82286: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py <<< 13531 1726882414.82292: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b624d4220> <<< 13531 1726882414.82308: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b625a5520> <<< 13531 1726882414.82361: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b625e9f10> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b62874040> <<< 13531 1726882414.82415: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 13531 1726882414.82458: stdout chunk (state=3): >>># code 
object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b624e6b50> import 'errno' # <<< 13531 1726882414.82501: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b624e6e80> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 13531 1726882414.82532: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py <<< 13531 1726882414.82545: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b624f7790> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 13531 1726882414.82585: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 13531 1726882414.82598: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b624f7cd0> <<< 13531 1726882414.82715: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader 
object at 0x7f2b62485400> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b624e6f70> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 13531 1726882414.82740: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b624962e0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b624f7610> import 'pwd' # <<< 13531 1726882414.82756: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b624963a0> <<< 13531 1726882414.82819: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b625e7a30> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 13531 1726882414.82866: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 13531 1726882414.82884: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 13531 1726882414.82888: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' <<< 13531 1726882414.82914: stdout chunk (state=3): >>># extension module 'math' executed from 
'/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b624b1700> <<< 13531 1726882414.82931: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 13531 1726882414.82946: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b624b19d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b624b17c0> <<< 13531 1726882414.82973: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b624b18b0> <<< 13531 1726882414.83000: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 13531 1726882414.83198: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b624b1d00> <<< 13531 1726882414.83233: stdout chunk (state=3): >>># extension module '_blake2' loaded from 
'/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' <<< 13531 1726882414.83267: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b624bc250> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b624b1940> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b624a5a90> <<< 13531 1726882414.83286: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b625e7610> <<< 13531 1726882414.83298: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 13531 1726882414.83360: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 13531 1726882414.83391: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b624b1af0> <<< 13531 1726882414.83491: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 13531 1726882414.83503: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f2b623cd6d0> <<< 13531 1726882414.83621: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip' <<< 13531 1726882414.83638: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.83711: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.83745: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/__init__.py # zipimport: zlib available <<< 13531 1726882414.83774: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.83790: stdout 
chunk (state=3): >>>import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available <<< 13531 1726882414.84989: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.85953: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b61dd6820> <<< 13531 1726882414.85999: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 13531 1726882414.86028: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 13531 1726882414.86044: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 13531 1726882414.86047: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b61dd6160> <<< 13531 1726882414.86080: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b61dd6280> <<< 13531 1726882414.86122: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f2b61dd6f70> <<< 13531 1726882414.86135: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 13531 1726882414.86188: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b61dd64f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b61dd6d90> import 'atexit' # <<< 13531 1726882414.86222: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b61dd6fd0> <<< 13531 1726882414.86238: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 13531 1726882414.86267: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 13531 1726882414.86308: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b61dd6100> <<< 13531 1726882414.86327: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 13531 1726882414.86360: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 13531 1726882414.86395: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 13531 1726882414.86398: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code 
object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 13531 1726882414.86474: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b61d2df40> <<< 13531 1726882414.86501: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b61d4cd00> <<< 13531 1726882414.86536: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b61d4ceb0> <<< 13531 1726882414.86549: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 13531 1726882414.86579: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 13531 1726882414.86618: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b61d4c370> <<< 13531 1726882414.86631: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b62359dc0> <<< 13531 1726882414.86802: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b623593a0> <<< 13531 1726882414.86838: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 13531 1726882414.86851: 
stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b62359fd0> <<< 13531 1726882414.86891: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 13531 1726882414.86908: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 13531 1726882414.86949: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 13531 1726882414.86954: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b6232ad30> <<< 13531 1726882414.87038: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b61da9d30> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b61da9400> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b61ddf4f0> <<< 13531 1726882414.87071: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b61da9520> <<< 13531 1726882414.87108: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches 
/usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b61da9550> <<< 13531 1726882414.87124: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 13531 1726882414.87145: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 13531 1726882414.87173: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 13531 1726882414.87246: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' <<< 13531 1726882414.87255: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b61d1dfd0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b6233b250> <<< 13531 1726882414.87277: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 13531 1726882414.87283: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 13531 1726882414.87337: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' <<< 13531 1726882414.87340: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b61d1a850> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b6233b3d0> <<< 13531 1726882414.87365: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 13531 1726882414.87399: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 13531 1726882414.87429: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py <<< 13531 1726882414.87432: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' <<< 13531 1726882414.87434: stdout chunk (state=3): >>>import '_string' # <<< 13531 1726882414.87494: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b62353e50> <<< 13531 1726882414.87618: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b61d1a7f0> <<< 13531 1726882414.87711: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b61d1a640> <<< 13531 1726882414.87743: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b61d195b0> <<< 13531 1726882414.87787: 
stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b61d0ed90> <<< 13531 1726882414.87807: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b62332910> <<< 13531 1726882414.87817: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 13531 1726882414.87829: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 13531 1726882414.87849: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 13531 1726882414.87903: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b61d9f6a0> <<< 13531 1726882414.88085: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b61d9db20> <<< 13531 1726882414.88113: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f2b61dad0a0> <<< 13531 1726882414.88146: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b61d9f100> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b61de2b20> # zipimport: zlib available <<< 13531 1726882414.88171: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py <<< 13531 1726882414.88187: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.88248: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.88339: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13531 1726882414.88377: stdout chunk (state=3): >>>import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py <<< 13531 1726882414.88389: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.88483: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.88581: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.89036: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.89616: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip 
/tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 13531 1726882414.89632: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b618e15e0> <<< 13531 1726882414.89847: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b61ceb580> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b61881100> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available <<< 13531 1726882414.89957: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.90035: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches 
/usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 13531 1726882414.90077: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b61d9db80> # zipimport: zlib available <<< 13531 1726882414.90435: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.90801: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.90856: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.90974: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available <<< 13531 1726882414.90979: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.91029: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available <<< 13531 1726882414.91080: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.91178: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/module_utils/errors.py <<< 13531 1726882414.91198: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available <<< 13531 1726882414.91240: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 13531 1726882414.91251: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 13531 1726882414.91425: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.91637: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 13531 1726882414.91661: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # <<< 13531 1726882414.91739: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b618b1f10> <<< 13531 1726882414.91785: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.91798: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.91923: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available <<< 13531 1726882414.91961: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.91976: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available <<< 13531 1726882414.92037: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.92061: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.92133: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 13531 1726882414.92211: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 13531 1726882414.92228: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 13531 1726882414.92312: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b62346220> <<< 13531 1726882414.92346: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b618b1850> <<< 13531 1726882414.92373: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/module_utils/common/process.py <<< 13531 1726882414.92384: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.92500: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.92565: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.92581: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.92642: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches 
/usr/lib64/python3.9/argparse.py <<< 13531 1726882414.92695: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 13531 1726882414.92713: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 13531 1726882414.92801: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b61cddca0> <<< 13531 1726882414.92849: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b61cd9f70> <<< 13531 1726882414.92897: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b61cd2940> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py <<< 13531 1726882414.92919: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.92947: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.92967: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py <<< 13531 1726882414.93034: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/module_utils/basic.py <<< 13531 1726882414.93057: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/modules/__init__.py 
<<< 13531 1726882414.93076: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.93182: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.93380: stdout chunk (state=3): >>># zipimport: zlib available <<< 13531 1726882414.93491: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 13531 1726882414.93853: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ <<< 13531 1726882414.93881: stdout chunk (state=3): >>># restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external<<< 13531 1726882414.93894: stdout chunk (state=3): >>> # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath <<< 13531 1726882414.93897: stdout chunk 
(state=3): >>># cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants <<< 13531 1726882414.93900: stdout chunk (state=3): >>># destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections <<< 13531 1726882414.93901: stdout chunk (state=3): >>># cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct <<< 13531 1726882414.93904: stdout chunk (state=3): >>># cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external <<< 13531 1726882414.93934: stdout chunk (state=3): >>># cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # 
cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing 
ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy 
ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 13531 1726882414.94139: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 13531 1726882414.94228: stdout chunk (state=3): >>># destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma <<< 13531 1726882414.94241: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array # destroy datetime # destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse <<< 13531 1726882414.94360: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping 
ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess <<< 13531 1726882414.94423: stdout chunk (state=3): >>># cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale <<< 13531 1726882414.94449: stdout chunk (state=3): >>># cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping 
stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 13531 1726882414.94620: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat <<< 13531 1726882414.94667: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 13531 1726882414.94694: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 13531 1726882414.95098: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 Shared connection to 10.31.11.158 closed. <<< 13531 1726882414.95102: stdout chunk (state=3): >>><<< 13531 1726882414.95104: stderr chunk (state=3): >>><<< 13531 1726882414.95202: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b6296cdc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b629113a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b6296cb20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b6296cac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2b62911490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b62911940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b62911670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b628c8190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b628c8220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b628eb850> import 'posixpath' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2b628c8940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b62929880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b628c1d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b628ebd90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b62911970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b62867eb0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b6286af40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from 
'/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b62860610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b62866640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b62867370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b625d4e20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b625d4910> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b625d4f10> # 
/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b625d4fd0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b625e70d0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b62842d90> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b6283b670> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b6284e6d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b6286ee20> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b625e7cd0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b628422b0> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b6284e2e0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b628749d0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b625e7eb0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b625e7df0> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b625e7d60> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f2b625ba3d0> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b625ba4c0> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b625eff40> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b625e9a90> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b625e9490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b624d4220> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b625a5520> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b625e9f10> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b62874040> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b624e6b50> import 'errno' # # extension 
module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b624e6e80> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b624f7790> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b624f7cd0> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b62485400> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b624e6f70> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b624962e0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b624f7610> import 'pwd' # # 
extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b624963a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b625e7a30> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b624b1700> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b624b19d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b624b17c0> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b624b18b0> # 
/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b624b1d00> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b624bc250> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b624b1940> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b624a5a90> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b625e7610> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b624b1af0> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f2b623cd6d0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/module_utils/__init__.py # zipimport: 
zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b61dd6820> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b61dd6160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b61dd6280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b61dd6f70> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b61dd64f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b61dd6d90> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b61dd6fd0> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b61dd6100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b61d2df40> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b61d4cd00> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b61d4ceb0> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f2b61d4c370> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b62359dc0> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b623593a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b62359fd0> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b6232ad30> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b61da9d30> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b61da9400> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b61ddf4f0> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b61da9520> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code 
object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b61da9550> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b61d1dfd0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b6233b250> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b61d1a850> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b6233b3d0> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 
'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b62353e50> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b61d1a7f0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b61d1a640> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b61d195b0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b61d0ed90> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b62332910> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from 
'/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b61d9f6a0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b61d9db20> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b61dad0a0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b61d9f100> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b61de2b20> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py import 
'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b618e15e0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b61ceb580> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b61881100> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b61d9db80> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b618b1f10> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip 
/tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b62346220> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b618b1850> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # 
/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b61cddca0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b61cd9f70> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b61cd2940> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_stat_payload_24fixa_l/ansible_stat_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # 
cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing 
importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # 
cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing 
ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy 
json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array # destroy datetime # destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping 
operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external 
# destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
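The interpreter cleanup trace above is written to the module's output stream, so the controller has to recover the JSON result while tolerating trailing junk (this is what produces the "Module invocation had junk after the JSON data" warning that follows). A minimal sketch of that kind of tolerant parsing, assuming a simple "first JSON object wins" rule; this is not Ansible's actual parser, and the function name is hypothetical:

```python
import json

def extract_module_result(stdout):
    """Recover the first JSON object from module stdout, tolerating
    trailing junk such as an interpreter cleanup trace.
    Simplified sketch -- not Ansible's actual parsing code."""
    decoder = json.JSONDecoder()
    start = stdout.index("{")              # locate the opening brace
    obj, end = decoder.raw_decode(stdout, start)
    junk = stdout[end:].strip()            # anything left over is "junk"
    return obj, junk

# Example: a module result followed by cleanup noise
out = '{"changed": false, "stat": {"exists": false}} # destroy __main__'
result, junk = extract_module_result(out)
# result["stat"]["exists"] is False; junk is non-empty, so a real
# controller would emit a junk-after-JSON warning here.
```

`json.JSONDecoder.raw_decode` is the standard-library hook for this: it parses one value starting at a given index and reports where it stopped, which is exactly what is needed when the payload is followed by unrelated text.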
[WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # 
cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit 
# cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing 
ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy 
_sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array # destroy datetime # destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # 
cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy 
ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks 13531 1726882414.95982: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882414.4389226-13661-61383009584362/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13531 1726882414.95985: _low_level_execute_command(): starting 13531 1726882414.95987: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882414.4389226-13661-61383009584362/ > /dev/null 2>&1 && sleep 0' 13531 1726882414.96944: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882414.96972: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882414.96990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882414.97014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882414.97060: 
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882414.97078: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882414.97092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882414.97124: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882414.97166: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882414.97196: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882414.97229: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882414.97286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882414.97304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882414.97318: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882414.97333: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882414.97372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882414.97522: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882414.97555: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882414.97613: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882414.97757: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882414.99674: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882414.99678: stdout chunk (state=3): >>><<< 13531 1726882414.99680: stderr chunk (state=3): >>><<< 13531 1726882415.00182: _low_level_execute_command() done: 
rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882415.00186: handler run complete 13531 1726882415.00189: attempt loop complete, returning result 13531 1726882415.00191: _execute() done 13531 1726882415.00194: dumping result to json 13531 1726882415.00196: done dumping result, returning 13531 1726882415.00198: done running TaskExecutor() for managed_node2/TASK: Check if system is ostree [0e448fcc-3ce9-4fd9-519d-0000000001cf] 13531 1726882415.00210: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000001cf 13531 1726882415.00286: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000001cf 13531 1726882415.00291: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 13531 1726882415.00360: no more pending results, returning what we have 13531 1726882415.00365: results queue empty 13531 1726882415.00366: 
checking for any_errors_fatal 13531 1726882415.00373: done checking for any_errors_fatal 13531 1726882415.00374: checking for max_fail_percentage 13531 1726882415.00376: done checking for max_fail_percentage 13531 1726882415.00377: checking to see if all hosts have failed and the running result is not ok 13531 1726882415.00378: done checking to see if all hosts have failed 13531 1726882415.00378: getting the remaining hosts for this loop 13531 1726882415.00380: done getting the remaining hosts for this loop 13531 1726882415.00383: getting the next task for host managed_node2 13531 1726882415.00388: done getting next task for host managed_node2 13531 1726882415.00390: ^ task is: TASK: Set flag to indicate system is ostree 13531 1726882415.00393: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882415.00396: getting variables 13531 1726882415.00397: in VariableManager get_vars() 13531 1726882415.00424: Calling all_inventory to load vars for managed_node2 13531 1726882415.00427: Calling groups_inventory to load vars for managed_node2 13531 1726882415.00431: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882415.00440: Calling all_plugins_play to load vars for managed_node2 13531 1726882415.00443: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882415.00447: Calling groups_plugins_play to load vars for managed_node2 13531 1726882415.00635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882415.00865: done with get_vars() 13531 1726882415.00875: done getting variables 13531 1726882415.00976: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 21:33:35 -0400 (0:00:00.664) 0:00:02.905 ****** 13531 1726882415.01018: entering _queue_task() for managed_node2/set_fact 13531 1726882415.01020: Creating lock for set_fact 13531 1726882415.01556: worker is 1 (out of 1 available) 13531 1726882415.01571: exiting _queue_task() for managed_node2/set_fact 13531 1726882415.01588: done queuing things up, now waiting for results queue to drain 13531 1726882415.01590: waiting for pending results... 
13531 1726882415.01877: running TaskExecutor() for managed_node2/TASK: Set flag to indicate system is ostree 13531 1726882415.02100: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000001d0 13531 1726882415.02114: variable 'ansible_search_path' from source: unknown 13531 1726882415.02121: variable 'ansible_search_path' from source: unknown 13531 1726882415.02195: calling self._execute() 13531 1726882415.02323: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882415.02326: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882415.02339: variable 'omit' from source: magic vars 13531 1726882415.03226: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13531 1726882415.03978: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13531 1726882415.04040: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13531 1726882415.04117: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13531 1726882415.04204: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13531 1726882415.04416: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13531 1726882415.04489: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13531 1726882415.04585: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882415.04696: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13531 1726882415.04974: Evaluated conditional (not __network_is_ostree is defined): True 13531 1726882415.05006: variable 'omit' from source: magic vars 13531 1726882415.06094: variable 'omit' from source: magic vars 13531 1726882415.06347: variable '__ostree_booted_stat' from source: set_fact 13531 1726882415.06835: variable 'omit' from source: magic vars 13531 1726882415.06916: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882415.06956: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882415.06991: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882415.07089: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882415.07129: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882415.07195: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882415.07204: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882415.07212: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882415.07355: Set connection var ansible_pipelining to False 13531 1726882415.07373: Set connection var ansible_timeout to 10 13531 1726882415.07402: Set connection var ansible_shell_executable to /bin/sh 13531 1726882415.07420: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882415.07433: Set connection var ansible_connection to ssh 13531 1726882415.07488: Set connection var ansible_shell_type to sh 13531 1726882415.07581: variable 'ansible_shell_executable' 
from source: unknown 13531 1726882415.07610: variable 'ansible_connection' from source: unknown 13531 1726882415.07636: variable 'ansible_module_compression' from source: unknown 13531 1726882415.07702: variable 'ansible_shell_type' from source: unknown 13531 1726882415.07715: variable 'ansible_shell_executable' from source: unknown 13531 1726882415.07730: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882415.07744: variable 'ansible_pipelining' from source: unknown 13531 1726882415.07755: variable 'ansible_timeout' from source: unknown 13531 1726882415.07767: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882415.08209: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882415.08224: variable 'omit' from source: magic vars 13531 1726882415.08233: starting attempt loop 13531 1726882415.08239: running the handler 13531 1726882415.08297: handler run complete 13531 1726882415.08312: attempt loop complete, returning result 13531 1726882415.08418: _execute() done 13531 1726882415.08426: dumping result to json 13531 1726882415.08440: done dumping result, returning 13531 1726882415.08462: done running TaskExecutor() for managed_node2/TASK: Set flag to indicate system is ostree [0e448fcc-3ce9-4fd9-519d-0000000001d0] 13531 1726882415.08489: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000001d0 ok: [managed_node2] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 13531 1726882415.08734: no more pending results, returning what we have 13531 1726882415.08736: results queue empty 13531 1726882415.08737: checking for any_errors_fatal 13531 1726882415.08745: done checking for any_errors_fatal 13531 
1726882415.08746: checking for max_fail_percentage 13531 1726882415.08748: done checking for max_fail_percentage 13531 1726882415.08749: checking to see if all hosts have failed and the running result is not ok 13531 1726882415.08750: done checking to see if all hosts have failed 13531 1726882415.08750: getting the remaining hosts for this loop 13531 1726882415.08754: done getting the remaining hosts for this loop 13531 1726882415.08759: getting the next task for host managed_node2 13531 1726882415.08770: done getting next task for host managed_node2 13531 1726882415.08773: ^ task is: TASK: Fix CentOS6 Base repo 13531 1726882415.08776: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882415.08780: getting variables 13531 1726882415.08781: in VariableManager get_vars() 13531 1726882415.08812: Calling all_inventory to load vars for managed_node2 13531 1726882415.08815: Calling groups_inventory to load vars for managed_node2 13531 1726882415.08818: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882415.08830: Calling all_plugins_play to load vars for managed_node2 13531 1726882415.08833: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882415.08836: Calling groups_plugins_play to load vars for managed_node2 13531 1726882415.09070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882415.09650: done with get_vars() 13531 1726882415.09665: done getting variables 13531 1726882415.09968: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000001d0 13531 1726882415.09971: WORKER PROCESS EXITING 13531 1726882415.10073: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 21:33:35 -0400 (0:00:00.091) 0:00:02.997 ****** 13531 1726882415.10185: entering _queue_task() for managed_node2/copy 13531 1726882415.10515: worker is 1 (out of 1 available) 13531 1726882415.10527: exiting _queue_task() for managed_node2/copy 13531 1726882415.10537: done queuing things up, now waiting for results queue to drain 13531 1726882415.10539: waiting for pending results... 
13531 1726882415.10861: running TaskExecutor() for managed_node2/TASK: Fix CentOS6 Base repo 13531 1726882415.10933: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000001d2 13531 1726882415.10942: variable 'ansible_search_path' from source: unknown 13531 1726882415.10946: variable 'ansible_search_path' from source: unknown 13531 1726882415.10978: calling self._execute() 13531 1726882415.11037: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882415.11040: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882415.11048: variable 'omit' from source: magic vars 13531 1726882415.11396: variable 'ansible_distribution' from source: facts 13531 1726882415.11413: Evaluated conditional (ansible_distribution == 'CentOS'): True 13531 1726882415.11525: variable 'ansible_distribution_major_version' from source: facts 13531 1726882415.11528: Evaluated conditional (ansible_distribution_major_version == '6'): False 13531 1726882415.11531: when evaluation is False, skipping this task 13531 1726882415.11534: _execute() done 13531 1726882415.11536: dumping result to json 13531 1726882415.11540: done dumping result, returning 13531 1726882415.11551: done running TaskExecutor() for managed_node2/TASK: Fix CentOS6 Base repo [0e448fcc-3ce9-4fd9-519d-0000000001d2] 13531 1726882415.11557: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000001d2 13531 1726882415.11702: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000001d2 13531 1726882415.11704: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 13531 1726882415.11767: no more pending results, returning what we have 13531 1726882415.11770: results queue empty 13531 1726882415.11771: checking for any_errors_fatal 13531 1726882415.11781: done checking for any_errors_fatal 13531 1726882415.11782: checking for 
max_fail_percentage 13531 1726882415.11783: done checking for max_fail_percentage 13531 1726882415.11784: checking to see if all hosts have failed and the running result is not ok 13531 1726882415.11784: done checking to see if all hosts have failed 13531 1726882415.11785: getting the remaining hosts for this loop 13531 1726882415.11786: done getting the remaining hosts for this loop 13531 1726882415.11789: getting the next task for host managed_node2 13531 1726882415.11810: done getting next task for host managed_node2 13531 1726882415.11824: ^ task is: TASK: Include the task 'enable_epel.yml' 13531 1726882415.11827: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882415.11829: getting variables 13531 1726882415.11830: in VariableManager get_vars() 13531 1726882415.11849: Calling all_inventory to load vars for managed_node2 13531 1726882415.11851: Calling groups_inventory to load vars for managed_node2 13531 1726882415.11853: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882415.11860: Calling all_plugins_play to load vars for managed_node2 13531 1726882415.11865: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882415.11868: Calling groups_plugins_play to load vars for managed_node2 13531 1726882415.12034: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882415.12174: done with get_vars() 13531 1726882415.12186: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 21:33:35 -0400 (0:00:00.020) 0:00:03.018 ****** 13531 1726882415.12279: entering _queue_task() for managed_node2/include_tasks 13531 1726882415.13344: worker is 1 (out of 1 available) 13531 1726882415.13369: exiting _queue_task() for managed_node2/include_tasks 13531 1726882415.13442: done queuing things up, now waiting for results queue to drain 13531 1726882415.13443: waiting for pending results... 
13531 1726882415.14392: running TaskExecutor() for managed_node2/TASK: Include the task 'enable_epel.yml' 13531 1726882415.14504: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000001d3 13531 1726882415.14523: variable 'ansible_search_path' from source: unknown 13531 1726882415.14531: variable 'ansible_search_path' from source: unknown 13531 1726882415.14581: calling self._execute() 13531 1726882415.14726: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882415.14738: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882415.14755: variable 'omit' from source: magic vars 13531 1726882415.15269: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13531 1726882415.18502: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13531 1726882415.18582: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13531 1726882415.18636: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13531 1726882415.18679: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13531 1726882415.18717: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13531 1726882415.18814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882415.18851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882415.18886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882415.18940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882415.18961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882415.19090: variable '__network_is_ostree' from source: set_fact 13531 1726882415.19112: Evaluated conditional (not __network_is_ostree | d(false)): True 13531 1726882415.19122: _execute() done 13531 1726882415.19128: dumping result to json 13531 1726882415.19140: done dumping result, returning 13531 1726882415.19159: done running TaskExecutor() for managed_node2/TASK: Include the task 'enable_epel.yml' [0e448fcc-3ce9-4fd9-519d-0000000001d3] 13531 1726882415.19172: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000001d3 13531 1726882415.19301: no more pending results, returning what we have 13531 1726882415.19307: in VariableManager get_vars() 13531 1726882415.19342: Calling all_inventory to load vars for managed_node2 13531 1726882415.19345: Calling groups_inventory to load vars for managed_node2 13531 1726882415.19348: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882415.19364: Calling all_plugins_play to load vars for managed_node2 13531 1726882415.19369: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882415.19373: Calling groups_plugins_play to load vars for managed_node2 13531 1726882415.19576: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882415.19770: done with get_vars() 13531 1726882415.19779: variable 'ansible_search_path' from source: unknown 13531 
1726882415.19780: variable 'ansible_search_path' from source: unknown 13531 1726882415.19821: we have included files to process 13531 1726882415.19822: generating all_blocks data 13531 1726882415.19824: done generating all_blocks data 13531 1726882415.19831: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 13531 1726882415.19833: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 13531 1726882415.19836: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 13531 1726882415.20427: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000001d3 13531 1726882415.20430: WORKER PROCESS EXITING 13531 1726882415.20944: done processing included file 13531 1726882415.20947: iterating over new_blocks loaded from include file 13531 1726882415.20948: in VariableManager get_vars() 13531 1726882415.21082: done with get_vars() 13531 1726882415.21084: filtering new block on tags 13531 1726882415.21109: done filtering new block on tags 13531 1726882415.21112: in VariableManager get_vars() 13531 1726882415.21125: done with get_vars() 13531 1726882415.21127: filtering new block on tags 13531 1726882415.21138: done filtering new block on tags 13531 1726882415.21140: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node2 13531 1726882415.21146: extending task lists for all hosts with included blocks 13531 1726882415.21416: done extending task lists 13531 1726882415.21418: done processing included files 13531 1726882415.21419: results queue empty 13531 1726882415.21419: checking for any_errors_fatal 13531 1726882415.21424: done checking for any_errors_fatal 13531 1726882415.21424: checking for max_fail_percentage 13531 
1726882415.21425: done checking for max_fail_percentage 13531 1726882415.21426: checking to see if all hosts have failed and the running result is not ok 13531 1726882415.21427: done checking to see if all hosts have failed 13531 1726882415.21428: getting the remaining hosts for this loop 13531 1726882415.21429: done getting the remaining hosts for this loop 13531 1726882415.21431: getting the next task for host managed_node2 13531 1726882415.21435: done getting next task for host managed_node2 13531 1726882415.21437: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 13531 1726882415.21440: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882415.21442: getting variables 13531 1726882415.21443: in VariableManager get_vars() 13531 1726882415.21454: Calling all_inventory to load vars for managed_node2 13531 1726882415.21457: Calling groups_inventory to load vars for managed_node2 13531 1726882415.21459: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882415.21466: Calling all_plugins_play to load vars for managed_node2 13531 1726882415.21475: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882415.21478: Calling groups_plugins_play to load vars for managed_node2 13531 1726882415.21974: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882415.22194: done with get_vars() 13531 1726882415.22206: done getting variables 13531 1726882415.22314: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 13531 1726882415.22442: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 9] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 21:33:35 -0400 (0:00:00.102) 0:00:03.120 ****** 13531 1726882415.22499: entering _queue_task() for managed_node2/command 13531 1726882415.22501: Creating lock for command 13531 1726882415.22809: worker is 1 (out of 1 available) 13531 1726882415.22824: exiting _queue_task() for managed_node2/command 13531 1726882415.22835: done queuing things up, now waiting for results queue to drain 13531 1726882415.22836: waiting for pending results... 
13531 1726882415.23668: running TaskExecutor() for managed_node2/TASK: Create EPEL 9 13531 1726882415.23760: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000001ed 13531 1726882415.23779: variable 'ansible_search_path' from source: unknown 13531 1726882415.23782: variable 'ansible_search_path' from source: unknown 13531 1726882415.23814: calling self._execute() 13531 1726882415.23869: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882415.23872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882415.23879: variable 'omit' from source: magic vars 13531 1726882415.24156: variable 'ansible_distribution' from source: facts 13531 1726882415.24162: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 13531 1726882415.24255: variable 'ansible_distribution_major_version' from source: facts 13531 1726882415.24259: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 13531 1726882415.24263: when evaluation is False, skipping this task 13531 1726882415.24267: _execute() done 13531 1726882415.24269: dumping result to json 13531 1726882415.24272: done dumping result, returning 13531 1726882415.24277: done running TaskExecutor() for managed_node2/TASK: Create EPEL 9 [0e448fcc-3ce9-4fd9-519d-0000000001ed] 13531 1726882415.24284: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000001ed 13531 1726882415.24378: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000001ed 13531 1726882415.24381: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 13531 1726882415.24440: no more pending results, returning what we have 13531 1726882415.24444: results queue empty 13531 1726882415.24445: checking for any_errors_fatal 13531 1726882415.24446: done checking for any_errors_fatal 13531 1726882415.24447: checking for 
max_fail_percentage 13531 1726882415.24448: done checking for max_fail_percentage 13531 1726882415.24449: checking to see if all hosts have failed and the running result is not ok 13531 1726882415.24449: done checking to see if all hosts have failed 13531 1726882415.24450: getting the remaining hosts for this loop 13531 1726882415.24454: done getting the remaining hosts for this loop 13531 1726882415.24457: getting the next task for host managed_node2 13531 1726882415.24462: done getting next task for host managed_node2 13531 1726882415.24467: ^ task is: TASK: Install yum-utils package 13531 1726882415.24471: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882415.24474: getting variables 13531 1726882415.24475: in VariableManager get_vars() 13531 1726882415.24500: Calling all_inventory to load vars for managed_node2 13531 1726882415.24503: Calling groups_inventory to load vars for managed_node2 13531 1726882415.24506: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882415.24515: Calling all_plugins_play to load vars for managed_node2 13531 1726882415.24517: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882415.24520: Calling groups_plugins_play to load vars for managed_node2 13531 1726882415.24624: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882415.24769: done with get_vars() 13531 1726882415.24776: done getting variables 13531 1726882415.24841: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 21:33:35 -0400 (0:00:00.023) 0:00:03.143 ****** 13531 1726882415.24866: entering _queue_task() for managed_node2/package 13531 1726882415.24867: Creating lock for package 13531 1726882415.25045: worker is 1 (out of 1 available) 13531 1726882415.25060: exiting _queue_task() for managed_node2/package 13531 1726882415.25073: done queuing things up, now waiting for results queue to drain 13531 1726882415.25074: waiting for pending results... 
13531 1726882415.25243: running TaskExecutor() for managed_node2/TASK: Install yum-utils package 13531 1726882415.25337: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000001ee 13531 1726882415.25346: variable 'ansible_search_path' from source: unknown 13531 1726882415.25350: variable 'ansible_search_path' from source: unknown 13531 1726882415.25389: calling self._execute() 13531 1726882415.25476: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882415.25491: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882415.25508: variable 'omit' from source: magic vars 13531 1726882415.25997: variable 'ansible_distribution' from source: facts 13531 1726882415.26013: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 13531 1726882415.26168: variable 'ansible_distribution_major_version' from source: facts 13531 1726882415.26179: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 13531 1726882415.26194: when evaluation is False, skipping this task 13531 1726882415.26205: _execute() done 13531 1726882415.26211: dumping result to json 13531 1726882415.26217: done dumping result, returning 13531 1726882415.26226: done running TaskExecutor() for managed_node2/TASK: Install yum-utils package [0e448fcc-3ce9-4fd9-519d-0000000001ee] 13531 1726882415.26235: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000001ee 13531 1726882415.26348: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000001ee 13531 1726882415.26358: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 13531 1726882415.26419: no more pending results, returning what we have 13531 1726882415.26423: results queue empty 13531 1726882415.26423: checking for any_errors_fatal 13531 1726882415.26435: done checking for any_errors_fatal 13531 
1726882415.26436: checking for max_fail_percentage 13531 1726882415.26438: done checking for max_fail_percentage 13531 1726882415.26439: checking to see if all hosts have failed and the running result is not ok 13531 1726882415.26439: done checking to see if all hosts have failed 13531 1726882415.26440: getting the remaining hosts for this loop 13531 1726882415.26442: done getting the remaining hosts for this loop 13531 1726882415.26448: getting the next task for host managed_node2 13531 1726882415.26457: done getting next task for host managed_node2 13531 1726882415.26459: ^ task is: TASK: Enable EPEL 7 13531 1726882415.26465: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882415.26469: getting variables 13531 1726882415.26471: in VariableManager get_vars() 13531 1726882415.26508: Calling all_inventory to load vars for managed_node2 13531 1726882415.26511: Calling groups_inventory to load vars for managed_node2 13531 1726882415.26515: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882415.26528: Calling all_plugins_play to load vars for managed_node2 13531 1726882415.26531: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882415.26535: Calling groups_plugins_play to load vars for managed_node2 13531 1726882415.26737: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882415.27128: done with get_vars() 13531 1726882415.27138: done getting variables 13531 1726882415.27220: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 21:33:35 -0400 (0:00:00.025) 0:00:03.168 ****** 13531 1726882415.27373: entering _queue_task() for managed_node2/command 13531 1726882415.27772: worker is 1 (out of 1 available) 13531 1726882415.27789: exiting _queue_task() for managed_node2/command 13531 1726882415.27800: done queuing things up, now waiting for results queue to drain 13531 1726882415.27801: waiting for pending results... 
13531 1726882415.28060: running TaskExecutor() for managed_node2/TASK: Enable EPEL 7 13531 1726882415.28180: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000001ef 13531 1726882415.28218: variable 'ansible_search_path' from source: unknown 13531 1726882415.28226: variable 'ansible_search_path' from source: unknown 13531 1726882415.28289: calling self._execute() 13531 1726882415.28385: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882415.28389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882415.28397: variable 'omit' from source: magic vars 13531 1726882415.28694: variable 'ansible_distribution' from source: facts 13531 1726882415.28704: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 13531 1726882415.28797: variable 'ansible_distribution_major_version' from source: facts 13531 1726882415.28803: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 13531 1726882415.28806: when evaluation is False, skipping this task 13531 1726882415.28809: _execute() done 13531 1726882415.28812: dumping result to json 13531 1726882415.28814: done dumping result, returning 13531 1726882415.28820: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 7 [0e448fcc-3ce9-4fd9-519d-0000000001ef] 13531 1726882415.28828: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000001ef skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 13531 1726882415.28955: no more pending results, returning what we have 13531 1726882415.28959: results queue empty 13531 1726882415.28960: checking for any_errors_fatal 13531 1726882415.28967: done checking for any_errors_fatal 13531 1726882415.28968: checking for max_fail_percentage 13531 1726882415.28970: done checking for max_fail_percentage 13531 1726882415.28970: checking to see if all hosts have failed and 
the running result is not ok 13531 1726882415.28971: done checking to see if all hosts have failed 13531 1726882415.28972: getting the remaining hosts for this loop 13531 1726882415.28973: done getting the remaining hosts for this loop 13531 1726882415.28977: getting the next task for host managed_node2 13531 1726882415.28982: done getting next task for host managed_node2 13531 1726882415.28984: ^ task is: TASK: Enable EPEL 8 13531 1726882415.28988: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882415.28990: getting variables 13531 1726882415.28991: in VariableManager get_vars() 13531 1726882415.29015: Calling all_inventory to load vars for managed_node2 13531 1726882415.29017: Calling groups_inventory to load vars for managed_node2 13531 1726882415.29020: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882415.29029: Calling all_plugins_play to load vars for managed_node2 13531 1726882415.29031: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882415.29034: Calling groups_plugins_play to load vars for managed_node2 13531 1726882415.29168: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000001ef 13531 1726882415.29171: WORKER PROCESS EXITING 13531 1726882415.29181: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882415.29295: done with get_vars() 13531 1726882415.29301: done getting variables 13531 1726882415.29338: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 21:33:35 -0400 (0:00:00.019) 0:00:03.188 ****** 13531 1726882415.29362: entering _queue_task() for managed_node2/command 13531 1726882415.29528: worker is 1 (out of 1 available) 13531 1726882415.29539: exiting _queue_task() for managed_node2/command 13531 1726882415.29554: done queuing things up, now waiting for results queue to drain 13531 1726882415.29555: waiting for pending results... 
13531 1726882415.29715: running TaskExecutor() for managed_node2/TASK: Enable EPEL 8 13531 1726882415.29831: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000001f0 13531 1726882415.29854: variable 'ansible_search_path' from source: unknown 13531 1726882415.29866: variable 'ansible_search_path' from source: unknown 13531 1726882415.29909: calling self._execute() 13531 1726882415.29987: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882415.30001: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882415.30019: variable 'omit' from source: magic vars 13531 1726882415.30521: variable 'ansible_distribution' from source: facts 13531 1726882415.30540: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 13531 1726882415.30689: variable 'ansible_distribution_major_version' from source: facts 13531 1726882415.30700: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 13531 1726882415.30707: when evaluation is False, skipping this task 13531 1726882415.30713: _execute() done 13531 1726882415.30719: dumping result to json 13531 1726882415.30724: done dumping result, returning 13531 1726882415.30733: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 8 [0e448fcc-3ce9-4fd9-519d-0000000001f0] 13531 1726882415.30743: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000001f0 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 13531 1726882415.30904: no more pending results, returning what we have 13531 1726882415.30910: results queue empty 13531 1726882415.30911: checking for any_errors_fatal 13531 1726882415.30918: done checking for any_errors_fatal 13531 1726882415.30919: checking for max_fail_percentage 13531 1726882415.30921: done checking for max_fail_percentage 13531 1726882415.30922: checking to see if all hosts have failed and 
the running result is not ok 13531 1726882415.30923: done checking to see if all hosts have failed 13531 1726882415.30923: getting the remaining hosts for this loop 13531 1726882415.30925: done getting the remaining hosts for this loop 13531 1726882415.30928: getting the next task for host managed_node2 13531 1726882415.30937: done getting next task for host managed_node2 13531 1726882415.30939: ^ task is: TASK: Enable EPEL 6 13531 1726882415.30944: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882415.30947: getting variables 13531 1726882415.30949: in VariableManager get_vars() 13531 1726882415.30981: Calling all_inventory to load vars for managed_node2 13531 1726882415.30984: Calling groups_inventory to load vars for managed_node2 13531 1726882415.30987: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882415.31001: Calling all_plugins_play to load vars for managed_node2 13531 1726882415.31004: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882415.31007: Calling groups_plugins_play to load vars for managed_node2 13531 1726882415.31192: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000001f0 13531 1726882415.31196: WORKER PROCESS EXITING 13531 1726882415.31212: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882415.31417: done with get_vars() 13531 1726882415.31426: done getting variables 13531 1726882415.31483: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 21:33:35 -0400 (0:00:00.021) 0:00:03.210 ****** 13531 1726882415.31511: entering _queue_task() for managed_node2/copy 13531 1726882415.31926: worker is 1 (out of 1 available) 13531 1726882415.31940: exiting _queue_task() for managed_node2/copy 13531 1726882415.31952: done queuing things up, now waiting for results queue to drain 13531 1726882415.31953: waiting for pending results... 
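The skips above follow Ansible's per-clause `when` evaluation: each conditional is tested in order, and the first one that renders False short-circuits the task into a skip whose result records the failing clause as `false_condition`. A rough Python sketch of that behavior — using `eval()` where Ansible really uses Jinja2 templating, and with hypothetical fact values standing in for the gathered facts:

```python
# Hypothetical facts for a RHEL/CentOS host whose major version is not 7 or 8
# (the trace implies exactly this situation for managed_node2).
facts = {
    "ansible_distribution": "CentOS",
    "ansible_distribution_major_version": "9",
}

def evaluate_when(conditions, variables):
    # Test each clause in order; the first False produces a skipped
    # result naming the failing clause, mirroring the trace output.
    for cond in conditions:
        if not eval(cond, {}, variables):  # stand-in for Jinja2 evaluation
            return {
                "changed": False,
                "false_condition": cond,
                "skip_reason": "Conditional result was False",
            }
    return None  # all clauses True: run the task

result = evaluate_when(
    [
        "ansible_distribution in ['RedHat', 'CentOS']",
        "ansible_distribution_major_version in ['7', '8']",
    ],
    facts,
)
```

With these facts the first clause is True and the second is False, reproducing the `skipping: [managed_node2]` JSON shown in the trace.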
13531 1726882415.32458: running TaskExecutor() for managed_node2/TASK: Enable EPEL 6 13531 1726882415.32517: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000001f2 13531 1726882415.32533: variable 'ansible_search_path' from source: unknown 13531 1726882415.32536: variable 'ansible_search_path' from source: unknown 13531 1726882415.32569: calling self._execute() 13531 1726882415.32635: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882415.32641: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882415.32651: variable 'omit' from source: magic vars 13531 1726882415.32920: variable 'ansible_distribution' from source: facts 13531 1726882415.32930: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 13531 1726882415.33009: variable 'ansible_distribution_major_version' from source: facts 13531 1726882415.33014: Evaluated conditional (ansible_distribution_major_version == '6'): False 13531 1726882415.33016: when evaluation is False, skipping this task 13531 1726882415.33019: _execute() done 13531 1726882415.33022: dumping result to json 13531 1726882415.33025: done dumping result, returning 13531 1726882415.33032: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 6 [0e448fcc-3ce9-4fd9-519d-0000000001f2] 13531 1726882415.33038: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000001f2 13531 1726882415.33125: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000001f2 13531 1726882415.33128: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 13531 1726882415.33182: no more pending results, returning what we have 13531 1726882415.33185: results queue empty 13531 1726882415.33186: checking for any_errors_fatal 13531 1726882415.33190: done checking for any_errors_fatal 13531 1726882415.33191: checking for max_fail_percentage 
13531 1726882415.33193: done checking for max_fail_percentage 13531 1726882415.33193: checking to see if all hosts have failed and the running result is not ok 13531 1726882415.33194: done checking to see if all hosts have failed 13531 1726882415.33195: getting the remaining hosts for this loop 13531 1726882415.33196: done getting the remaining hosts for this loop 13531 1726882415.33199: getting the next task for host managed_node2 13531 1726882415.33205: done getting next task for host managed_node2 13531 1726882415.33207: ^ task is: TASK: Set network provider to 'nm' 13531 1726882415.33209: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882415.33213: getting variables 13531 1726882415.33214: in VariableManager get_vars() 13531 1726882415.33240: Calling all_inventory to load vars for managed_node2 13531 1726882415.33243: Calling groups_inventory to load vars for managed_node2 13531 1726882415.33246: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882415.33253: Calling all_plugins_play to load vars for managed_node2 13531 1726882415.33255: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882415.33257: Calling groups_plugins_play to load vars for managed_node2 13531 1726882415.33392: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882415.33503: done with get_vars() 13531 1726882415.33509: done getting variables 13531 1726882415.33547: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_nm.yml:13 Friday 20 September 2024 21:33:35 -0400 (0:00:00.020) 0:00:03.230 ****** 13531 1726882415.33568: entering _queue_task() for managed_node2/set_fact 13531 1726882415.33730: worker is 1 (out of 1 available) 13531 1726882415.33744: exiting _queue_task() for managed_node2/set_fact 13531 1726882415.33755: done queuing things up, now waiting for results queue to drain 13531 1726882415.33757: waiting for pending results... 13531 1726882415.33905: running TaskExecutor() for managed_node2/TASK: Set network provider to 'nm' 13531 1726882415.33966: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000007 13531 1726882415.33977: variable 'ansible_search_path' from source: unknown 13531 1726882415.34008: calling self._execute() 13531 1726882415.34093: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882415.34097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882415.34105: variable 'omit' from source: magic vars 13531 1726882415.34208: variable 'omit' from source: magic vars 13531 1726882415.34259: variable 'omit' from source: magic vars 13531 1726882415.34285: variable 'omit' from source: magic vars 13531 1726882415.34327: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882415.34375: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882415.34391: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882415.34404: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882415.34414: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882415.34448: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882415.34465: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882415.34485: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882415.34587: Set connection var ansible_pipelining to False 13531 1726882415.34604: Set connection var ansible_timeout to 10 13531 1726882415.34613: Set connection var ansible_shell_executable to /bin/sh 13531 1726882415.34622: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882415.34627: Set connection var ansible_connection to ssh 13531 1726882415.34632: Set connection var ansible_shell_type to sh 13531 1726882415.34664: variable 'ansible_shell_executable' from source: unknown 13531 1726882415.34677: variable 'ansible_connection' from source: unknown 13531 1726882415.34683: variable 'ansible_module_compression' from source: unknown 13531 1726882415.34689: variable 'ansible_shell_type' from source: unknown 13531 1726882415.34694: variable 'ansible_shell_executable' from source: unknown 13531 1726882415.34699: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882415.34711: variable 'ansible_pipelining' from source: unknown 13531 1726882415.34720: variable 'ansible_timeout' from source: unknown 13531 1726882415.34727: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882415.34873: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882415.34888: variable 'omit' from source: magic vars 13531 1726882415.34902: starting 
attempt loop 13531 1726882415.34908: running the handler 13531 1726882415.34924: handler run complete 13531 1726882415.34944: attempt loop complete, returning result 13531 1726882415.34951: _execute() done 13531 1726882415.34959: dumping result to json 13531 1726882415.34968: done dumping result, returning 13531 1726882415.34978: done running TaskExecutor() for managed_node2/TASK: Set network provider to 'nm' [0e448fcc-3ce9-4fd9-519d-000000000007] 13531 1726882415.34987: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000007 13531 1726882415.35090: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000007 13531 1726882415.35096: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 13531 1726882415.35221: no more pending results, returning what we have 13531 1726882415.35224: results queue empty 13531 1726882415.35225: checking for any_errors_fatal 13531 1726882415.35230: done checking for any_errors_fatal 13531 1726882415.35231: checking for max_fail_percentage 13531 1726882415.35232: done checking for max_fail_percentage 13531 1726882415.35235: checking to see if all hosts have failed and the running result is not ok 13531 1726882415.35236: done checking to see if all hosts have failed 13531 1726882415.35237: getting the remaining hosts for this loop 13531 1726882415.35238: done getting the remaining hosts for this loop 13531 1726882415.35241: getting the next task for host managed_node2 13531 1726882415.35246: done getting next task for host managed_node2 13531 1726882415.35248: ^ task is: TASK: meta (flush_handlers) 13531 1726882415.35249: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882415.35253: getting variables 13531 1726882415.35254: in VariableManager get_vars() 13531 1726882415.35286: Calling all_inventory to load vars for managed_node2 13531 1726882415.35289: Calling groups_inventory to load vars for managed_node2 13531 1726882415.35292: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882415.35301: Calling all_plugins_play to load vars for managed_node2 13531 1726882415.35303: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882415.35306: Calling groups_plugins_play to load vars for managed_node2 13531 1726882415.35479: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882415.35686: done with get_vars() 13531 1726882415.35698: done getting variables 13531 1726882415.35771: in VariableManager get_vars() 13531 1726882415.35780: Calling all_inventory to load vars for managed_node2 13531 1726882415.35782: Calling groups_inventory to load vars for managed_node2 13531 1726882415.35784: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882415.35788: Calling all_plugins_play to load vars for managed_node2 13531 1726882415.35790: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882415.35793: Calling groups_plugins_play to load vars for managed_node2 13531 1726882415.35949: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882415.36186: done with get_vars() 13531 1726882415.36199: done queuing things up, now waiting for results queue to drain 13531 1726882415.36200: results queue empty 13531 1726882415.36201: checking for any_errors_fatal 13531 1726882415.36203: done checking for any_errors_fatal 13531 1726882415.36203: checking for max_fail_percentage 13531 1726882415.36205: done checking for max_fail_percentage 13531 1726882415.36205: checking to see if all hosts have failed and the running result is not 
ok 13531 1726882415.36206: done checking to see if all hosts have failed 13531 1726882415.36207: getting the remaining hosts for this loop 13531 1726882415.36207: done getting the remaining hosts for this loop 13531 1726882415.36210: getting the next task for host managed_node2 13531 1726882415.36213: done getting next task for host managed_node2 13531 1726882415.36215: ^ task is: TASK: meta (flush_handlers) 13531 1726882415.36216: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882415.36223: getting variables 13531 1726882415.36224: in VariableManager get_vars() 13531 1726882415.36231: Calling all_inventory to load vars for managed_node2 13531 1726882415.36233: Calling groups_inventory to load vars for managed_node2 13531 1726882415.36236: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882415.36240: Calling all_plugins_play to load vars for managed_node2 13531 1726882415.36243: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882415.36245: Calling groups_plugins_play to load vars for managed_node2 13531 1726882415.36399: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882415.36519: done with get_vars() 13531 1726882415.36524: done getting variables 13531 1726882415.36559: in VariableManager get_vars() 13531 1726882415.36569: Calling all_inventory to load vars for managed_node2 13531 1726882415.36571: Calling groups_inventory to load vars for managed_node2 13531 1726882415.36573: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882415.36577: Calling all_plugins_play to load vars for managed_node2 13531 1726882415.36580: Calling groups_plugins_inventory to load vars for 
managed_node2 13531 1726882415.36586: Calling groups_plugins_play to load vars for managed_node2 13531 1726882415.36726: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882415.36847: done with get_vars() 13531 1726882415.36856: done queuing things up, now waiting for results queue to drain 13531 1726882415.36857: results queue empty 13531 1726882415.36858: checking for any_errors_fatal 13531 1726882415.36858: done checking for any_errors_fatal 13531 1726882415.36859: checking for max_fail_percentage 13531 1726882415.36859: done checking for max_fail_percentage 13531 1726882415.36860: checking to see if all hosts have failed and the running result is not ok 13531 1726882415.36860: done checking to see if all hosts have failed 13531 1726882415.36861: getting the remaining hosts for this loop 13531 1726882415.36861: done getting the remaining hosts for this loop 13531 1726882415.36863: getting the next task for host managed_node2 13531 1726882415.36866: done getting next task for host managed_node2 13531 1726882415.36866: ^ task is: None 13531 1726882415.36867: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882415.36868: done queuing things up, now waiting for results queue to drain 13531 1726882415.36869: results queue empty 13531 1726882415.36869: checking for any_errors_fatal 13531 1726882415.36869: done checking for any_errors_fatal 13531 1726882415.36870: checking for max_fail_percentage 13531 1726882415.36870: done checking for max_fail_percentage 13531 1726882415.36871: checking to see if all hosts have failed and the running result is not ok 13531 1726882415.36871: done checking to see if all hosts have failed 13531 1726882415.36872: getting the next task for host managed_node2 13531 1726882415.36874: done getting next task for host managed_node2 13531 1726882415.36874: ^ task is: None 13531 1726882415.36875: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882415.36914: in VariableManager get_vars() 13531 1726882415.36945: done with get_vars() 13531 1726882415.36950: in VariableManager get_vars() 13531 1726882415.36967: done with get_vars() 13531 1726882415.36970: variable 'omit' from source: magic vars 13531 1726882415.36990: in VariableManager get_vars() 13531 1726882415.37004: done with get_vars() 13531 1726882415.37018: variable 'omit' from source: magic vars PLAY [Play for testing bond removal] ******************************************* 13531 1726882415.37953: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 13531 1726882415.37979: getting the remaining hosts for this loop 13531 1726882415.37981: done getting the remaining hosts for this loop 13531 1726882415.37983: getting the next task for host managed_node2 13531 1726882415.37986: done getting next task for host managed_node2 13531 1726882415.37988: ^ task is: TASK: Gathering Facts 13531 1726882415.37989: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882415.37991: getting variables 13531 1726882415.37992: in VariableManager get_vars() 13531 1726882415.38010: Calling all_inventory to load vars for managed_node2 13531 1726882415.38012: Calling groups_inventory to load vars for managed_node2 13531 1726882415.38014: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882415.38019: Calling all_plugins_play to load vars for managed_node2 13531 1726882415.38031: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882415.38035: Calling groups_plugins_play to load vars for managed_node2 13531 1726882415.38166: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882415.38346: done with get_vars() 13531 1726882415.38353: done getting variables 13531 1726882415.38394: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:3 Friday 20 September 2024 21:33:35 -0400 (0:00:00.048) 0:00:03.279 ****** 13531 1726882415.38419: entering _queue_task() for managed_node2/gather_facts 13531 1726882415.38603: worker is 1 (out of 1 available) 13531 1726882415.38613: exiting _queue_task() for managed_node2/gather_facts 13531 1726882415.38626: done queuing things up, now waiting for results queue to drain 13531 1726882415.38627: waiting for pending results... 
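Before it can transfer the setup module for fact gathering, the ssh connection plugin probes the remote side with trivial shell commands; the `/bin/sh -c 'echo ~ && sleep 0'` call visible in the trace discovers the remote user's home directory for building the temporary path. A local illustration of the same probe, run through `/bin/sh` directly rather than over ssh:

```python
import subprocess

# Ansible wraps every low-level command in `/bin/sh -c '... && sleep 0'`.
# `echo ~` expands to the user's home directory -- the `/root` seen in the
# stdout chunk of the trace when this runs on the managed node.
proc = subprocess.run(
    ["/bin/sh", "-c", "echo ~ && sleep 0"],
    capture_output=True,
    text=True,
)
home = proc.stdout.strip()
```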
13531 1726882415.38791: running TaskExecutor() for managed_node2/TASK: Gathering Facts 13531 1726882415.38855: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000218 13531 1726882415.38871: variable 'ansible_search_path' from source: unknown 13531 1726882415.38900: calling self._execute() 13531 1726882415.38965: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882415.38972: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882415.38981: variable 'omit' from source: magic vars 13531 1726882415.39243: variable 'ansible_distribution_major_version' from source: facts 13531 1726882415.39251: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882415.39264: variable 'omit' from source: magic vars 13531 1726882415.39284: variable 'omit' from source: magic vars 13531 1726882415.39309: variable 'omit' from source: magic vars 13531 1726882415.39341: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882415.39374: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882415.39390: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882415.39406: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882415.39415: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882415.39437: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882415.39440: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882415.39442: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882415.39517: Set connection var ansible_pipelining to False 13531 1726882415.39521: Set 
connection var ansible_timeout to 10 13531 1726882415.39527: Set connection var ansible_shell_executable to /bin/sh 13531 1726882415.39532: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882415.39534: Set connection var ansible_connection to ssh 13531 1726882415.39537: Set connection var ansible_shell_type to sh 13531 1726882415.39557: variable 'ansible_shell_executable' from source: unknown 13531 1726882415.39561: variable 'ansible_connection' from source: unknown 13531 1726882415.39564: variable 'ansible_module_compression' from source: unknown 13531 1726882415.39567: variable 'ansible_shell_type' from source: unknown 13531 1726882415.39569: variable 'ansible_shell_executable' from source: unknown 13531 1726882415.39572: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882415.39575: variable 'ansible_pipelining' from source: unknown 13531 1726882415.39578: variable 'ansible_timeout' from source: unknown 13531 1726882415.39580: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882415.39706: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882415.39712: variable 'omit' from source: magic vars 13531 1726882415.39717: starting attempt loop 13531 1726882415.39721: running the handler 13531 1726882415.39733: variable 'ansible_facts' from source: unknown 13531 1726882415.39748: _low_level_execute_command(): starting 13531 1726882415.39756: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13531 1726882415.40390: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882415.40502: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882415.40518: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882415.40538: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882415.40551: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882415.40689: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882415.42356: stdout chunk (state=3): >>>/root <<< 13531 1726882415.42458: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882415.42513: stderr chunk (state=3): >>><<< 13531 1726882415.42516: stdout chunk (state=3): >>><<< 13531 1726882415.42539: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882415.42557: _low_level_execute_command(): starting 13531 1726882415.42560: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882415.4253962-13710-249932382815925 `" && echo ansible-tmp-1726882415.4253962-13710-249932382815925="` echo /root/.ansible/tmp/ansible-tmp-1726882415.4253962-13710-249932382815925 `" ) && sleep 0' 13531 1726882415.43034: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882415.43048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882415.43070: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 13531 1726882415.43087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882415.43141: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882415.43153: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882415.43266: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882415.45145: stdout chunk (state=3): >>>ansible-tmp-1726882415.4253962-13710-249932382815925=/root/.ansible/tmp/ansible-tmp-1726882415.4253962-13710-249932382815925 <<< 13531 1726882415.45255: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882415.45301: stderr chunk (state=3): >>><<< 13531 1726882415.45304: stdout chunk (state=3): >>><<< 13531 1726882415.45318: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882415.4253962-13710-249932382815925=/root/.ansible/tmp/ansible-tmp-1726882415.4253962-13710-249932382815925 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882415.45346: variable 'ansible_module_compression' from source: unknown 13531 1726882415.45394: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13531i_snf1g8/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 13531 1726882415.45439: variable 'ansible_facts' from source: unknown 13531 1726882415.45563: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882415.4253962-13710-249932382815925/AnsiballZ_setup.py 13531 1726882415.45679: Sending initial data 13531 1726882415.45688: Sent initial data (154 bytes) 13531 1726882415.46359: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882415.46362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882415.46400: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 13531 1726882415.46403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882415.46406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 
10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882415.46462: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882415.46468: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882415.46578: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882415.48307: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13531 1726882415.48401: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13531 1726882415.48576: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13531i_snf1g8/tmpnwr0dzvo /root/.ansible/tmp/ansible-tmp-1726882415.4253962-13710-249932382815925/AnsiballZ_setup.py <<< 13531 1726882415.48619: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13531 1726882415.51109: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882415.51457: stderr chunk (state=3): >>><<< 13531 1726882415.51460: stdout chunk (state=3): >>><<< 13531 1726882415.51465: done transferring module to remote 13531 1726882415.51467: _low_level_execute_command(): starting 13531 1726882415.51470: _low_level_execute_command(): executing: 
/bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882415.4253962-13710-249932382815925/ /root/.ansible/tmp/ansible-tmp-1726882415.4253962-13710-249932382815925/AnsiballZ_setup.py && sleep 0' 13531 1726882415.52201: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882415.52216: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882415.52232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882415.52254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882415.52303: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882415.52316: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882415.52330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882415.52350: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882415.52370: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882415.52383: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882415.52396: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882415.52410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882415.52427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882415.52441: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882415.52458: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882415.52475: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 
1726882415.52637: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882415.52660: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882415.52678: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882415.52913: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882415.54548: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882415.54607: stderr chunk (state=3): >>><<< 13531 1726882415.54610: stdout chunk (state=3): >>><<< 13531 1726882415.54631: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882415.54634: _low_level_execute_command(): starting 13531 1726882415.54637: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1726882415.4253962-13710-249932382815925/AnsiballZ_setup.py && sleep 0' 13531 1726882415.55093: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882415.55096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882415.55139: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882415.55142: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 13531 1726882415.55144: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882415.55146: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882415.55199: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882415.55202: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882415.55312: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882417.06040: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", 
"ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_lsb": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-158", "ansible_nodename": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "21e18164a0c64d0daed004bd8a1b67b7", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-6<<< 13531 1726882417.06068: stdout chunk (state=3): >>>4G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": 
"(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 33528 10.31.11.158 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 33528 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALEARW5ZJ51XTLSDuUsPojumVU0f1DmiQsXjMOap4QLlljOiysapjSUe6pZOyAdiI/KfARhDoOFvlC07kCLCcs7DDk8JxBZpsM0D55SdDlfwsB3FVgWNP+9by8G6kzbePHWdZyyWlAuavj4OAEwAjpWpP8/daus0ha4xywlVVoKjAAAAFQCbiW4bR+tgMvjrxC198dqI1mTbjQAAAIBzCzkJTtnGDKfOHq2dFI5cUEuaj1PgRot3wyaXENzUjZVnIFgXUmgKDCxO+EAtU6uAkBPQF4XNgiuaw5bavYpZxcJ4WIpM4ZDRoSkc7BBbJPRLZ45GfrHJwgqAmAZ3RSvVqeXE4WKQHLm43/eDHewgPqqqWe6QVuQH5SEe79yk3wAAAIEArG+AuupiAeoVJ9Lh36QMj4kRo5pTASh2eD5MqSOdy39UhsXbWBcj3JCIvNk/nwep/9neGyRZ5t5wT05dRX80vlgZJX65hrbepO+lqC3wlng+6GQ34D7TJKYnvEkR3neE0+06kx5R6IRWZf1YQV6fMQhx8AJ2JmvnLFicmYlkhQQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDND+RJCrYgIUzolo5fZ64Ey6cksefKDUWmGDjsqVTmuT3HrlDyUZOro4JAnUQBmiamXsJUFbrFdJAVpukD4yyowqCQLr0ZFuKNEzrt5CObrtWflOskKynO3kaoU0WhDkqIbwS2j/+NxBCxgDGqd/5Os3cOMv3eyjUElz6xoI4zsmGMfxVYmT+/SHBfoyxyqY8Hw2Ooq+H5L9OlYgV4hqu7kKPpM1THUJTjy47m6qvws5gztclLjPA1KIW2Dz6kKzUYspNJcoS2sK1xFvL7mBjpGAP7WhXVH2n5ySenQ24Z6mEj+tG2f11rjPpjCUjDzzciGCWiRDZWBLm/GGmQXJJ8zAYnw82yIUKqufLrr1wmcXICPMVj9pFjXSoBWe/yhX9E87w7YD5HWsUrgrLdSctdV4QYy+R5g9ERi7FjwbRsuZ04BihZs70+f/29hUzuc6MA87KVovGT0Uc7GVC7bx8NLt0bTBsbydlONVHVQuol/YEpQrQophDvmBfh+PgMDH8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOEITn1vyppR+Moe1UdR0WGPhUnQ/dwHNcNi0OYy21LkBQ5jsxOPLvZ+C2MbRYlz2afs4nYYIV8E0AuK6aRks3w=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKEdFOHVk9tX1R+zEyLVdxS/U5QeeeFYWSnUmjpXlpt7", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_pkg_mgr": "dnf", "ansible_loadavg": {"1m": 0.59, "5m": 0.38, "15m": 0.18}, "ansible_is_chroot": false, "ansible_fibre_channel_wwn": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "33", "second": "35", "epoch": "1726882415", "epoch_int": "1726882415", "date": "2024-09-20", "time": "21:33:35", "iso8601_micro": "2024-09-21T01:33:35.800469Z", "iso8601": "2024-09-21T01:33:35Z", "iso8601_basic": "20240920T213335800469", "iso8601_basic_short": "20240920T213335", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], 
"ansible_process<<< 13531 1726882417.06095: stdout chunk (state=3): >>>or_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2823, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 709, "free": 2823}, "nocache": {"free": 3284, "used": 248}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_uuid": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 354, "ansible_lvm": "N/A", 
"ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264241668096, "block_size": 4096, "block_total": 65519355, "block_available": 64512126, "block_used": 1007229, "inode_total": 131071472, "inode_available": 130998698, "inode_used": 72774, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_iscsi_iqn": "", "ansible_fips": false, "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", 
"tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_re<<< 13531 1726882417.06120: stdout chunk (state=3): >>>mcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::104f:68ff:fe7a:deb1", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", 
"tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": 
{"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.158"], "ansible_all_ipv6_addres<<< 13531 1726882417.06125: stdout chunk (state=3): >>>ses": ["fe80::104f:68ff:fe7a:deb1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.158", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::104f:68ff:fe7a:deb1"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 13531 1726882417.07712: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 13531 1726882417.07800: stderr chunk (state=3): >>><<< 13531 1726882417.07804: stdout chunk (state=3): >>><<< 13531 1726882417.08473: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_lsb": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, 
"ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-158", "ansible_nodename": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "21e18164a0c64d0daed004bd8a1b67b7", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 33528 10.31.11.158 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": 
"root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 33528 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALEARW5ZJ51XTLSDuUsPojumVU0f1DmiQsXjMOap4QLlljOiysapjSUe6pZOyAdiI/KfARhDoOFvlC07kCLCcs7DDk8JxBZpsM0D55SdDlfwsB3FVgWNP+9by8G6kzbePHWdZyyWlAuavj4OAEwAjpWpP8/daus0ha4xywlVVoKjAAAAFQCbiW4bR+tgMvjrxC198dqI1mTbjQAAAIBzCzkJTtnGDKfOHq2dFI5cUEuaj1PgRot3wyaXENzUjZVnIFgXUmgKDCxO+EAtU6uAkBPQF4XNgiuaw5bavYpZxcJ4WIpM4ZDRoSkc7BBbJPRLZ45GfrHJwgqAmAZ3RSvVqeXE4WKQHLm43/eDHewgPqqqWe6QVuQH5SEe79yk3wAAAIEArG+AuupiAeoVJ9Lh36QMj4kRo5pTASh2eD5MqSOdy39UhsXbWBcj3JCIvNk/nwep/9neGyRZ5t5wT05dRX80vlgZJX65hrbepO+lqC3wlng+6GQ34D7TJKYnvEkR3neE0+06kx5R6IRWZf1YQV6fMQhx8AJ2JmvnLFicmYlkhQQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDND+RJCrYgIUzolo5fZ64Ey6cksefKDUWmGDjsqVTmuT3HrlDyUZOro4JAnUQBmiamXsJUFbrFdJAVpukD4yyowqCQLr0ZFuKNEzrt5CObrtWflOskKynO3kaoU0WhDkqIbwS2j/+NxBCxgDGqd/5Os3cOMv3eyjUElz6xoI4zsmGMfxVYmT+/SHBfoyxyqY8Hw2Ooq+H5L9OlYgV4hqu7kKPpM1THUJTjy47m6qvws5gztclLjPA1KIW2Dz6kKzUYspNJcoS2sK1xFvL7mBjpGAP7WhXVH2n5ySenQ24Z6mEj+tG2f11rjPpjCUjDzzciGCWiRDZWBLm/GGmQXJJ8zAYnw82yIUKqufLrr1wmcXICPMVj9pFjXSoBWe/yhX9E87w7YD5HWsUrgrLdSctdV4QYy+R5g9ERi7FjwbRsuZ04BihZs70+f/29hUzuc6MA87KVovGT0Uc7GVC7bx8NLt0bTBsbydlONVHVQuol/YEpQrQophDvmBfh+PgMDH8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOEITn1vyppR+Moe1UdR0WGPhUnQ/dwHNcNi0OYy21LkBQ5jsxOPLvZ+C2MbRYlz2afs4nYYIV8E0AuK6aRks3w=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKEdFOHVk9tX1R+zEyLVdxS/U5QeeeFYWSnUmjpXlpt7", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_pkg_mgr": "dnf", "ansible_loadavg": {"1m": 0.59, "5m": 0.38, "15m": 0.18}, "ansible_is_chroot": false, "ansible_fibre_channel_wwn": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "33", "second": "35", "epoch": "1726882415", "epoch_int": "1726882415", "date": "2024-09-20", "time": "21:33:35", "iso8601_micro": "2024-09-21T01:33:35.800469Z", "iso8601": "2024-09-21T01:33:35Z", "iso8601_basic": "20240920T213335800469", "iso8601_basic_short": "20240920T213335", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2823, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 709, "free": 2823}, "nocache": {"free": 3284, "used": 248}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", 
"ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_uuid": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 354, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264241668096, "block_size": 4096, "block_total": 65519355, "block_available": 64512126, "block_used": 1007229, "inode_total": 131071472, "inode_available": 130998698, "inode_used": 72774, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_iscsi_iqn": "", "ansible_fips": false, "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", 
"promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", 
"esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::104f:68ff:fe7a:deb1", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off 
[fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.158"], "ansible_all_ipv6_addresses": ["fe80::104f:68ff:fe7a:deb1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.158", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::104f:68ff:fe7a:deb1"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
13531 1726882417.08485: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882415.4253962-13710-249932382815925/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
13531 1726882417.08488: _low_level_execute_command(): starting
13531 1726882417.08491: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882415.4253962-13710-249932382815925/ > /dev/null 2>&1 && sleep 0'
13531 1726882417.09115: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
13531 1726882417.09128: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
13531 1726882417.09142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
13531 1726882417.09167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
13531 1726882417.09209: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<<
13531 1726882417.09221: stderr chunk (state=3): >>>debug2: match not found <<<
13531 1726882417.09234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13531 1726882417.09255: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
13531 1726882417.09271: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<<
13531 1726882417.09282: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
13531 1726882417.09294: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
13531 1726882417.09308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
13531 1726882417.09324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
13531 1726882417.09336: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<<
13531 1726882417.09347: stderr chunk (state=3): >>>debug2: match found <<<
13531 1726882417.09366: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13531 1726882417.09439: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
13531 1726882417.09468: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
13531 1726882417.09490: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
13531 1726882417.09623: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13531 1726882417.11516: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13531 1726882417.11520: stdout chunk (state=3): >>><<<
13531 1726882417.11523: stderr chunk (state=3): >>><<<
13531 1726882417.12169: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
13531 1726882417.12173: handler run complete
13531 1726882417.12176: variable 'ansible_facts' from source: unknown
13531 1726882417.12178: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13531 1726882417.12180: variable 'ansible_facts' from source: unknown
13531 1726882417.12182: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13531 1726882417.12297: attempt loop complete, returning result
13531 1726882417.12307: _execute() done
13531 1726882417.12314: dumping result to json
13531 1726882417.12355: done dumping result, returning
13531 1726882417.12374: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0e448fcc-3ce9-4fd9-519d-000000000218]
13531 1726882417.12385: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000218
ok: [managed_node2]
13531 1726882417.13184: no more pending results, returning what we have
13531 1726882417.13187: results queue empty
13531 1726882417.13188: checking for any_errors_fatal
13531 1726882417.13189: done checking for any_errors_fatal
13531 1726882417.13190: checking for max_fail_percentage
13531 1726882417.13191: done checking for max_fail_percentage
13531 1726882417.13192: checking to see if all hosts have failed and the running result is not ok
13531
1726882417.13193: done checking to see if all hosts have failed
13531 1726882417.13194: getting the remaining hosts for this loop
13531 1726882417.13195: done getting the remaining hosts for this loop
13531 1726882417.13198: getting the next task for host managed_node2
13531 1726882417.13204: done getting next task for host managed_node2
13531 1726882417.13206: ^ task is: TASK: meta (flush_handlers)
13531 1726882417.13207: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13531 1726882417.13211: getting variables
13531 1726882417.13213: in VariableManager get_vars()
13531 1726882417.13260: Calling all_inventory to load vars for managed_node2
13531 1726882417.13263: Calling groups_inventory to load vars for managed_node2
13531 1726882417.13267: Calling all_plugins_inventory to load vars for managed_node2
13531 1726882417.13278: Calling all_plugins_play to load vars for managed_node2
13531 1726882417.13281: Calling groups_plugins_inventory to load vars for managed_node2
13531 1726882417.13284: Calling groups_plugins_play to load vars for managed_node2
13531 1726882417.13439: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13531 1726882417.13879: done with get_vars()
13531 1726882417.13888: done getting variables
13531 1726882417.13915: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000218
13531 1726882417.13918: WORKER PROCESS EXITING
13531 1726882417.13958: in VariableManager get_vars()
13531 1726882417.13979: Calling all_inventory to load vars for managed_node2
13531 1726882417.13981: Calling groups_inventory to load vars for managed_node2
13531 1726882417.13983: Calling all_plugins_inventory to load vars for managed_node2
13531 1726882417.13987: Calling all_plugins_play to load vars for managed_node2
13531 1726882417.13989: Calling groups_plugins_inventory to load vars for managed_node2
13531 1726882417.13996: Calling groups_plugins_play to load vars for managed_node2
13531 1726882417.14129: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13531 1726882417.14326: done with get_vars()
13531 1726882417.14337: done queuing things up, now waiting for results queue to drain
13531 1726882417.14339: results queue empty
13531 1726882417.14340: checking for any_errors_fatal
13531 1726882417.14343: done checking for any_errors_fatal
13531 1726882417.14343: checking for max_fail_percentage
13531 1726882417.14344: done checking for max_fail_percentage
13531 1726882417.14345: checking to see if all hosts have failed and the running result is not ok
13531 1726882417.14346: done checking to see if all hosts have failed
13531 1726882417.14346: getting the remaining hosts for this loop
13531 1726882417.14347: done getting the remaining hosts for this loop
13531 1726882417.14349: getting the next task for host managed_node2
13531 1726882417.14355: done getting next task for host managed_node2
13531 1726882417.14357: ^ task is: TASK: INIT Prepare setup
13531 1726882417.14358: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13531 1726882417.14360: getting variables
13531 1726882417.14361: in VariableManager get_vars()
13531 1726882417.14379: Calling all_inventory to load vars for managed_node2
13531 1726882417.14381: Calling groups_inventory to load vars for managed_node2
13531 1726882417.14383: Calling all_plugins_inventory to load vars for managed_node2
13531 1726882417.14388: Calling all_plugins_play to load vars for managed_node2
13531 1726882417.14390: Calling groups_plugins_inventory to load vars for managed_node2
13531 1726882417.14392: Calling groups_plugins_play to load vars for managed_node2
13531 1726882417.14522: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13531 1726882417.14945: done with get_vars()
13531 1726882417.14956: done getting variables
13531 1726882417.15030: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)
TASK [INIT Prepare setup] ******************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:15
Friday 20 September 2024 21:33:37 -0400 (0:00:01.766) 0:00:05.045 ******
13531 1726882417.15058: entering _queue_task() for managed_node2/debug
13531 1726882417.15059: Creating lock for debug
13531 1726882417.15330: worker is 1 (out of 1 available)
13531 1726882417.15343: exiting _queue_task() for managed_node2/debug
13531 1726882417.15359: done queuing things up, now waiting for results queue to drain
13531 1726882417.15360: waiting for pending results...
13531 1726882417.15629: running TaskExecutor() for managed_node2/TASK: INIT Prepare setup
13531 1726882417.15729: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000000b
13531 1726882417.15750: variable 'ansible_search_path' from source: unknown
13531 1726882417.15796: calling self._execute()
13531 1726882417.15891: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882417.15903: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882417.15922: variable 'omit' from source: magic vars
13531 1726882417.16401: variable 'ansible_distribution_major_version' from source: facts
13531 1726882417.16420: Evaluated conditional (ansible_distribution_major_version != '6'): True
13531 1726882417.16431: variable 'omit' from source: magic vars
13531 1726882417.16458: variable 'omit' from source: magic vars
13531 1726882417.16503: variable 'omit' from source: magic vars
13531 1726882417.16556: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
13531 1726882417.16600: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
13531 1726882417.16623: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
13531 1726882417.16643: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13531 1726882417.16662: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13531 1726882417.16700: variable 'inventory_hostname' from source: host vars for 'managed_node2'
13531 1726882417.16708: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882417.16715: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882417.16828: Set connection var ansible_pipelining to False
13531 1726882417.16839: Set connection var ansible_timeout to 10
13531 1726882417.16848: Set connection var ansible_shell_executable to /bin/sh
13531 1726882417.16860: Set connection var ansible_module_compression to ZIP_DEFLATED
13531 1726882417.16869: Set connection var ansible_connection to ssh
13531 1726882417.16875: Set connection var ansible_shell_type to sh
13531 1726882417.16909: variable 'ansible_shell_executable' from source: unknown
13531 1726882417.16916: variable 'ansible_connection' from source: unknown
13531 1726882417.16922: variable 'ansible_module_compression' from source: unknown
13531 1726882417.16929: variable 'ansible_shell_type' from source: unknown
13531 1726882417.16935: variable 'ansible_shell_executable' from source: unknown
13531 1726882417.16941: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882417.16947: variable 'ansible_pipelining' from source: unknown
13531 1726882417.16956: variable 'ansible_timeout' from source: unknown
13531 1726882417.16966: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882417.17124: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
13531 1726882417.17140: variable 'omit' from source: magic vars
13531 1726882417.17150: starting attempt loop
13531 1726882417.17161: running the handler
13531 1726882417.17220: handler run complete
13531 1726882417.17254: attempt loop complete, returning result
13531 1726882417.17261: _execute() done
13531 1726882417.17269: dumping result to json
13531 1726882417.17276: done dumping result, returning
13531 1726882417.17285: done running TaskExecutor() for managed_node2/TASK: INIT Prepare setup [0e448fcc-3ce9-4fd9-519d-00000000000b]
13531 1726882417.17294: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000000b
ok: [managed_node2] => {}
MSG:
##################################################
13531 1726882417.17439: no more pending results, returning what we have
13531 1726882417.17442: results queue empty
13531 1726882417.17443: checking for any_errors_fatal
13531 1726882417.17445: done checking for any_errors_fatal
13531 1726882417.17446: checking for max_fail_percentage
13531 1726882417.17448: done checking for max_fail_percentage
13531 1726882417.17449: checking to see if all hosts have failed and the running result is not ok
13531 1726882417.17449: done checking to see if all hosts have failed
13531 1726882417.17450: getting the remaining hosts for this loop
13531 1726882417.17454: done getting the remaining hosts for this loop
13531 1726882417.17458: getting the next task for host managed_node2
13531 1726882417.17467: done getting next task for host managed_node2
13531 1726882417.17471: ^ task is: TASK: Install dnsmasq
13531 1726882417.17475: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13531 1726882417.17480: getting variables
13531 1726882417.17482: in VariableManager get_vars()
13531 1726882417.17610: Calling all_inventory to load vars for managed_node2
13531 1726882417.17613: Calling groups_inventory to load vars for managed_node2
13531 1726882417.17616: Calling all_plugins_inventory to load vars for managed_node2
13531 1726882417.17626: Calling all_plugins_play to load vars for managed_node2
13531 1726882417.17629: Calling groups_plugins_inventory to load vars for managed_node2
13531 1726882417.17631: Calling groups_plugins_play to load vars for managed_node2
13531 1726882417.17795: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13531 1726882417.18062: done with get_vars()
13531 1726882417.18073: done getting variables
13531 1726882417.18124: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [Install dnsmasq] *********************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3
Friday 20 September 2024 21:33:37 -0400 (0:00:00.030) 0:00:05.076 ******
13531 1726882417.18161: entering _queue_task() for managed_node2/package
13531 1726882417.18182: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000000b
13531 1726882417.18192: WORKER PROCESS EXITING
13531 1726882417.18662: worker is 1 (out of 1 available)
13531 1726882417.18676: exiting _queue_task() for managed_node2/package
13531 1726882417.18688: done queuing things up, now waiting for results queue to drain
13531 1726882417.18689: waiting for pending results...
13531 1726882417.19347: running TaskExecutor() for managed_node2/TASK: Install dnsmasq
13531 1726882417.19546: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000000f
13531 1726882417.19695: variable 'ansible_search_path' from source: unknown
13531 1726882417.19831: variable 'ansible_search_path' from source: unknown
13531 1726882417.19878: calling self._execute()
13531 1726882417.20086: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882417.20097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882417.20112: variable 'omit' from source: magic vars
13531 1726882417.21030: variable 'ansible_distribution_major_version' from source: facts
13531 1726882417.21049: Evaluated conditional (ansible_distribution_major_version != '6'): True
13531 1726882417.21067: variable 'omit' from source: magic vars
13531 1726882417.21119: variable 'omit' from source: magic vars
13531 1726882417.21560: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13531 1726882417.25173: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13531 1726882417.25254: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13531 1726882417.25343: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13531 1726882417.25404: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13531 1726882417.25510: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13531 1726882417.25638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13531 1726882417.25688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13531 1726882417.25722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13531 1726882417.25788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13531 1726882417.25813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13531 1726882417.25945: variable '__network_is_ostree' from source: set_fact
13531 1726882417.25971: variable 'omit' from source: magic vars
13531 1726882417.26009: variable 'omit' from source: magic vars
13531 1726882417.26046: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
13531 1726882417.26097: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
13531 1726882417.26117: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
13531 1726882417.26146: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13531 1726882417.26167: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13531 1726882417.26214: variable 'inventory_hostname' from source: host vars for 'managed_node2'
13531 1726882417.26223: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882417.26236: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882417.26374: Set connection var ansible_pipelining to False
13531 1726882417.26385: Set connection var ansible_timeout to 10
13531 1726882417.26399: Set connection var ansible_shell_executable to /bin/sh
13531 1726882417.26419: Set connection var ansible_module_compression to ZIP_DEFLATED
13531 1726882417.26426: Set connection var ansible_connection to ssh
13531 1726882417.26433: Set connection var ansible_shell_type to sh
13531 1726882417.26479: variable 'ansible_shell_executable' from source: unknown
13531 1726882417.26486: variable 'ansible_connection' from source: unknown
13531 1726882417.26492: variable 'ansible_module_compression' from source: unknown
13531 1726882417.26498: variable 'ansible_shell_type' from source: unknown
13531 1726882417.26508: variable 'ansible_shell_executable' from source: unknown
13531 1726882417.26518: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882417.26530: variable 'ansible_pipelining' from source: unknown
13531 1726882417.26535: variable 'ansible_timeout' from source: unknown
13531 1726882417.26542: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882417.26657: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
13531 1726882417.26678: variable 'omit' from source: magic vars
13531 1726882417.26691: starting attempt loop
13531 1726882417.26698: running the handler
13531 1726882417.26709: variable 'ansible_facts' from source: unknown
13531 1726882417.26715: variable 'ansible_facts' from source: unknown
13531 1726882417.26767: _low_level_execute_command(): starting
13531 1726882417.26780: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
13531
1726882417.28235: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882417.28239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882417.28270: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882417.28274: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882417.28344: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882417.28349: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882417.28354: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882417.28476: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882417.30124: stdout chunk (state=3): >>>/root <<< 13531 1726882417.30224: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882417.30310: stderr chunk (state=3): >>><<< 13531 1726882417.30313: stdout chunk (state=3): >>><<< 13531 1726882417.30435: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882417.30444: _low_level_execute_command(): starting 13531 1726882417.30447: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882417.3033297-13781-166835468523810 `" && echo ansible-tmp-1726882417.3033297-13781-166835468523810="` echo /root/.ansible/tmp/ansible-tmp-1726882417.3033297-13781-166835468523810 `" ) && sleep 0' 13531 1726882417.31159: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882417.31176: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882417.31191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882417.31208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882417.31251: stderr chunk (state=3): >>>debug2: checking match for 'final all' 
host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882417.31270: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882417.31285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882417.31302: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882417.31315: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882417.31326: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882417.31339: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882417.31355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882417.31373: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882417.31385: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882417.31395: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882417.31408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882417.31490: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882417.31511: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882417.31526: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882417.31658: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882417.33534: stdout chunk (state=3): >>>ansible-tmp-1726882417.3033297-13781-166835468523810=/root/.ansible/tmp/ansible-tmp-1726882417.3033297-13781-166835468523810 <<< 13531 1726882417.33679: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882417.33736: stderr chunk (state=3): >>><<< 
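The `_low_level_execute_command()` call above creates Ansible's remote working directory with a single compound shell command. A minimal sketch of that pattern, runnable locally: `umask 77` makes the directory `0700`, `mkdir -p` covers the parent, and the trailing `echo name=path` is how the controller learns the resolved path. `BASE` and the suffix format here are illustrative stand-ins, not Ansible's exact naming scheme.

```shell
#!/bin/sh
# Sketch of the remote tmp-dir pattern seen in the log above.
# umask 77 -> the new directories are created mode 0700 (owner-only),
# so module payloads staged there are not world-readable.
BASE="${TMPDIR:-/tmp}/ansible-sketch"          # stand-in for ~/.ansible/tmp
NAME="ansible-tmp-$(date +%s).$$"              # stand-in for Ansible's timestamp+pid suffix
( umask 77 && mkdir -p "$BASE" && mkdir "$BASE/$NAME" \
  && echo "$NAME=$BASE/$NAME" ) && sleep 0
```

The `echo "$NAME=$BASE/$NAME"` line mirrors the log's `ansible-tmp-…=…` stdout: the controller parses that key=value pair to know where to `put` the AnsiballZ payload next.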
13531 1726882417.33739: stdout chunk (state=3): >>><<< 13531 1726882417.33975: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882417.3033297-13781-166835468523810=/root/.ansible/tmp/ansible-tmp-1726882417.3033297-13781-166835468523810 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882417.33978: variable 'ansible_module_compression' from source: unknown 13531 1726882417.33981: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 13531 1726882417.33984: ANSIBALLZ: Acquiring lock 13531 1726882417.33986: ANSIBALLZ: Lock acquired: 139969312288320 13531 1726882417.33988: ANSIBALLZ: Creating module 13531 1726882417.51990: ANSIBALLZ: Writing module into payload 13531 1726882417.52287: ANSIBALLZ: Writing module 13531 1726882417.52322: ANSIBALLZ: Renaming module 13531 1726882417.52333: ANSIBALLZ: Done creating module 13531 1726882417.52370: variable 'ansible_facts' from 
source: unknown 13531 1726882417.52464: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882417.3033297-13781-166835468523810/AnsiballZ_dnf.py 13531 1726882417.52638: Sending initial data 13531 1726882417.52641: Sent initial data (152 bytes) 13531 1726882417.53734: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882417.53749: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882417.53776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882417.53796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882417.53839: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882417.53851: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882417.53870: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882417.53897: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882417.53909: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882417.53921: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882417.53934: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882417.53949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882417.53968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882417.53985: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882417.54004: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882417.54020: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882417.54102: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882417.54130: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882417.54148: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882417.54293: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882417.56123: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13531 1726882417.56216: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13531 1726882417.56315: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13531i_snf1g8/tmpuvplg4_c /root/.ansible/tmp/ansible-tmp-1726882417.3033297-13781-166835468523810/AnsiballZ_dnf.py <<< 13531 1726882417.56412: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13531 1726882417.58143: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882417.58341: stderr chunk (state=3): >>><<< 13531 1726882417.58344: stdout chunk (state=3): >>><<< 13531 1726882417.58346: done transferring module to remote 13531 1726882417.58348: _low_level_execute_command(): starting 13531 1726882417.58350: _low_level_execute_command(): executing: /bin/sh -c 'chmod 
u+x /root/.ansible/tmp/ansible-tmp-1726882417.3033297-13781-166835468523810/ /root/.ansible/tmp/ansible-tmp-1726882417.3033297-13781-166835468523810/AnsiballZ_dnf.py && sleep 0' 13531 1726882417.59032: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882417.59045: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882417.59058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882417.59079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882417.59145: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882417.59158: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882417.59176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882417.59195: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882417.59208: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882417.59228: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882417.59241: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882417.59255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882417.59274: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882417.59287: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882417.59298: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882417.59313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882417.59398: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882417.59420: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882417.59439: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882417.59586: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882417.61403: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882417.61406: stdout chunk (state=3): >>><<< 13531 1726882417.61408: stderr chunk (state=3): >>><<< 13531 1726882417.61498: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882417.61502: _low_level_execute_command(): starting 13531 1726882417.61504: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1726882417.3033297-13781-166835468523810/AnsiballZ_dnf.py && sleep 0' 13531 1726882417.62106: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882417.62120: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882417.62136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882417.62154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882417.63072: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882417.63406: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882417.63782: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882418.65667: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], 
"enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 13531 1726882418.71274: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 13531 1726882418.71278: stdout chunk (state=3): >>><<< 13531 1726882418.71284: stderr chunk (state=3): >>><<< 13531 1726882418.71301: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 13531 1726882418.71343: done with _execute_module (ansible.legacy.dnf, {'name': 'dnsmasq', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882417.3033297-13781-166835468523810/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13531 1726882418.71350: _low_level_execute_command(): starting 13531 1726882418.71357: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882417.3033297-13781-166835468523810/ > /dev/null 2>&1 && sleep 0' 13531 1726882418.72968: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882418.73683: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882418.73693: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882418.73706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882418.73951: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882418.73955: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882418.73957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882418.73959: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882418.73961: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882418.73965: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882418.73967: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882418.73968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882418.73970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882418.73972: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882418.73974: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882418.73975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882418.73977: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882418.73979: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882418.73980: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882418.74280: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882418.76017: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882418.76097: stderr chunk (state=3): >>><<< 13531 1726882418.76100: stdout chunk (state=3): >>><<< 13531 1726882418.76117: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882418.76124: handler run complete 13531 1726882418.76298: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13531 1726882418.76481: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13531 1726882418.76518: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13531 1726882418.76548: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13531 1726882418.76582: Loading TestModule 'uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13531 1726882418.76651: variable '__install_status' from source: unknown 13531 1726882418.76675: Evaluated conditional (__install_status is success): True 13531 1726882418.76691: attempt loop complete, returning result 13531 1726882418.76694: _execute() done 13531 1726882418.76696: dumping result to json 13531 1726882418.76703: done dumping result, returning 13531 1726882418.76710: done running TaskExecutor() for managed_node2/TASK: Install dnsmasq [0e448fcc-3ce9-4fd9-519d-00000000000f] 13531 1726882418.76716: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000000f 13531 1726882418.76824: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000000f 13531 1726882418.76827: WORKER PROCESS EXITING ok: [managed_node2] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 13531 1726882418.76919: no more pending results, returning what we have 13531 1726882418.76922: results queue empty 13531 1726882418.76923: checking for any_errors_fatal 13531 1726882418.76930: done checking for any_errors_fatal 13531 1726882418.76931: checking for max_fail_percentage 13531 1726882418.76932: done checking for max_fail_percentage 13531 1726882418.76933: checking to see if all hosts have failed and the running result is not ok 13531 1726882418.76934: done checking to see if all hosts have failed 13531 1726882418.76934: getting the remaining hosts for this loop 13531 1726882418.76937: done getting the remaining hosts for this loop 13531 1726882418.76940: getting the next task for host managed_node2 13531 1726882418.76945: done getting next task for host managed_node2 13531 1726882418.76948: ^ task is: TASK: Install pgrep, sysctl 13531 1726882418.76951: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882418.76954: getting variables 13531 1726882418.76955: in VariableManager get_vars() 13531 1726882418.77007: Calling all_inventory to load vars for managed_node2 13531 1726882418.77010: Calling groups_inventory to load vars for managed_node2 13531 1726882418.77012: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882418.77022: Calling all_plugins_play to load vars for managed_node2 13531 1726882418.77024: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882418.77031: Calling groups_plugins_play to load vars for managed_node2 13531 1726882418.77196: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882418.77440: done with get_vars() 13531 1726882418.77451: done getting variables 13531 1726882418.77857: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:17 Friday 20 September 2024 21:33:38 -0400 (0:00:01.597) 0:00:06.674 ****** 13531 1726882418.77887: entering _queue_task() for managed_node2/package 13531 1726882418.78333: worker is 1 (out of 1 available) 13531 1726882418.78343: exiting 
_queue_task() for managed_node2/package 13531 1726882418.78355: done queuing things up, now waiting for results queue to drain 13531 1726882418.78359: waiting for pending results... 13531 1726882418.79343: running TaskExecutor() for managed_node2/TASK: Install pgrep, sysctl 13531 1726882418.79478: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000010 13531 1726882418.79497: variable 'ansible_search_path' from source: unknown 13531 1726882418.79507: variable 'ansible_search_path' from source: unknown 13531 1726882418.79550: calling self._execute() 13531 1726882418.79642: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882418.79662: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882418.79679: variable 'omit' from source: magic vars 13531 1726882418.80044: variable 'ansible_distribution_major_version' from source: facts 13531 1726882418.80065: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882418.80183: variable 'ansible_os_family' from source: facts 13531 1726882418.80193: Evaluated conditional (ansible_os_family == 'RedHat'): True 13531 1726882418.80373: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13531 1726882418.80650: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13531 1726882418.80708: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13531 1726882418.80745: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13531 1726882418.80788: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13531 1726882418.80877: variable 'ansible_distribution_major_version' from source: facts 13531 1726882418.80894: Evaluated conditional (ansible_distribution_major_version is version('6', '<=')): False 13531 
1726882418.80901: when evaluation is False, skipping this task 13531 1726882418.80906: _execute() done 13531 1726882418.80912: dumping result to json 13531 1726882418.80917: done dumping result, returning 13531 1726882418.80925: done running TaskExecutor() for managed_node2/TASK: Install pgrep, sysctl [0e448fcc-3ce9-4fd9-519d-000000000010] 13531 1726882418.80934: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000010 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version is version('6', '<=')", "skip_reason": "Conditional result was False" } 13531 1726882418.81120: no more pending results, returning what we have 13531 1726882418.81123: results queue empty 13531 1726882418.81124: checking for any_errors_fatal 13531 1726882418.81134: done checking for any_errors_fatal 13531 1726882418.81134: checking for max_fail_percentage 13531 1726882418.81136: done checking for max_fail_percentage 13531 1726882418.81137: checking to see if all hosts have failed and the running result is not ok 13531 1726882418.81138: done checking to see if all hosts have failed 13531 1726882418.81138: getting the remaining hosts for this loop 13531 1726882418.81139: done getting the remaining hosts for this loop 13531 1726882418.81143: getting the next task for host managed_node2 13531 1726882418.81150: done getting next task for host managed_node2 13531 1726882418.81155: ^ task is: TASK: Install pgrep, sysctl 13531 1726882418.81158: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 13531 1726882418.81162: getting variables 13531 1726882418.81166: in VariableManager get_vars() 13531 1726882418.81223: Calling all_inventory to load vars for managed_node2 13531 1726882418.81226: Calling groups_inventory to load vars for managed_node2 13531 1726882418.81229: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882418.81241: Calling all_plugins_play to load vars for managed_node2 13531 1726882418.81244: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882418.81247: Calling groups_plugins_play to load vars for managed_node2 13531 1726882418.81541: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882418.81757: done with get_vars() 13531 1726882418.81771: done getting variables 13531 1726882418.81834: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26 Friday 20 September 2024 21:33:38 -0400 (0:00:00.039) 0:00:06.713 ****** 13531 1726882418.81872: entering _queue_task() for managed_node2/package 13531 1726882418.81891: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000010 13531 1726882418.81899: WORKER PROCESS EXITING 13531 1726882418.82379: worker is 1 (out of 1 available) 13531 1726882418.82390: exiting _queue_task() for managed_node2/package 13531 1726882418.82402: done queuing things up, now waiting for results queue to drain 13531 1726882418.82403: waiting for pending 
results... 13531 1726882418.82640: running TaskExecutor() for managed_node2/TASK: Install pgrep, sysctl 13531 1726882418.82749: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000011 13531 1726882418.82770: variable 'ansible_search_path' from source: unknown 13531 1726882418.82777: variable 'ansible_search_path' from source: unknown 13531 1726882418.82817: calling self._execute() 13531 1726882418.82905: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882418.82915: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882418.82928: variable 'omit' from source: magic vars 13531 1726882418.83302: variable 'ansible_distribution_major_version' from source: facts 13531 1726882418.83317: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882418.83430: variable 'ansible_os_family' from source: facts 13531 1726882418.83441: Evaluated conditional (ansible_os_family == 'RedHat'): True 13531 1726882418.83626: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13531 1726882418.83973: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13531 1726882418.84015: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13531 1726882418.84050: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13531 1726882418.84093: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13531 1726882418.84173: variable 'ansible_distribution_major_version' from source: facts 13531 1726882418.84189: Evaluated conditional (ansible_distribution_major_version is version('7', '>=')): True 13531 1726882418.84197: variable 'omit' from source: magic vars 13531 1726882418.84242: variable 'omit' from source: magic vars 13531 1726882418.84410: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13531 1726882418.88912: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13531 1726882418.89041: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13531 1726882418.89220: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13531 1726882418.89265: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13531 1726882418.89314: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13531 1726882418.89527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882418.89642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882418.89680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882418.89839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882418.89865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882418.90091: variable '__network_is_ostree' from source: set_fact 13531 1726882418.90101: 
variable 'omit' from source: magic vars 13531 1726882418.90131: variable 'omit' from source: magic vars 13531 1726882418.90276: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882418.90304: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882418.90323: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882418.90341: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882418.90356: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882418.90390: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882418.90397: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882418.90483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882418.90596: Set connection var ansible_pipelining to False 13531 1726882418.90675: Set connection var ansible_timeout to 10 13531 1726882418.90684: Set connection var ansible_shell_executable to /bin/sh 13531 1726882418.90695: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882418.90700: Set connection var ansible_connection to ssh 13531 1726882418.90706: Set connection var ansible_shell_type to sh 13531 1726882418.90733: variable 'ansible_shell_executable' from source: unknown 13531 1726882418.90739: variable 'ansible_connection' from source: unknown 13531 1726882418.90744: variable 'ansible_module_compression' from source: unknown 13531 1726882418.90749: variable 'ansible_shell_type' from source: unknown 13531 1726882418.90756: variable 'ansible_shell_executable' from source: unknown 13531 1726882418.90761: variable 'ansible_host' from source: host vars for 'managed_node2' 
13531 1726882418.90806: variable 'ansible_pipelining' from source: unknown 13531 1726882418.90813: variable 'ansible_timeout' from source: unknown 13531 1726882418.90820: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882418.90963: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882418.91028: variable 'omit' from source: magic vars 13531 1726882418.91076: starting attempt loop 13531 1726882418.91083: running the handler 13531 1726882418.91093: variable 'ansible_facts' from source: unknown 13531 1726882418.91099: variable 'ansible_facts' from source: unknown 13531 1726882418.91180: _low_level_execute_command(): starting 13531 1726882418.91190: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13531 1726882418.94322: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882418.94326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882418.94360: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 13531 1726882418.94366: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882418.94369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 13531 1726882418.94378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882418.94421: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882418.94432: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882418.94558: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882418.96284: stdout chunk (state=3): >>>/root <<< 13531 1726882418.96339: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882418.96417: stderr chunk (state=3): >>><<< 13531 1726882418.96420: stdout chunk (state=3): >>><<< 13531 1726882418.96555: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882418.96559: _low_level_execute_command(): starting 13531 1726882418.96563: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882418.9644628-13867-230231670997745 `" && echo ansible-tmp-1726882418.9644628-13867-230231670997745="` echo /root/.ansible/tmp/ansible-tmp-1726882418.9644628-13867-230231670997745 `" ) && sleep 0' 13531 1726882418.97856: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882418.97860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882418.97897: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 13531 1726882418.97900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882418.97903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 13531 1726882418.97905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882418.98795: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882418.99040: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK <<< 13531 1726882418.99043: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882418.99158: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882419.01034: stdout chunk (state=3): >>>ansible-tmp-1726882418.9644628-13867-230231670997745=/root/.ansible/tmp/ansible-tmp-1726882418.9644628-13867-230231670997745 <<< 13531 1726882419.01231: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882419.01234: stdout chunk (state=3): >>><<< 13531 1726882419.01236: stderr chunk (state=3): >>><<< 13531 1726882419.01581: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882418.9644628-13867-230231670997745=/root/.ansible/tmp/ansible-tmp-1726882418.9644628-13867-230231670997745 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882419.01585: variable 
'ansible_module_compression' from source: unknown 13531 1726882419.01588: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13531i_snf1g8/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 13531 1726882419.01591: variable 'ansible_facts' from source: unknown 13531 1726882419.01594: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882418.9644628-13867-230231670997745/AnsiballZ_dnf.py 13531 1726882419.02025: Sending initial data 13531 1726882419.02028: Sent initial data (152 bytes) 13531 1726882419.03592: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882419.03889: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882419.03899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882419.03913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882419.03954: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882419.03958: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882419.03970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882419.03986: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882419.03995: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882419.04002: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882419.04010: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882419.04019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882419.04031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882419.04038: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882419.04045: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882419.04056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882419.04122: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882419.04141: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882419.04155: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882419.04285: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882419.06028: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13531 1726882419.06122: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13531 1726882419.06227: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13531i_snf1g8/tmpqewn4zql /root/.ansible/tmp/ansible-tmp-1726882418.9644628-13867-230231670997745/AnsiballZ_dnf.py <<< 13531 1726882419.06319: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13531 1726882419.08494: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882419.08683: stderr chunk (state=3): >>><<< 
13531 1726882419.08686: stdout chunk (state=3): >>><<< 13531 1726882419.08688: done transferring module to remote 13531 1726882419.08691: _low_level_execute_command(): starting 13531 1726882419.08693: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882418.9644628-13867-230231670997745/ /root/.ansible/tmp/ansible-tmp-1726882418.9644628-13867-230231670997745/AnsiballZ_dnf.py && sleep 0' 13531 1726882419.09495: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882419.09509: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882419.09523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882419.09541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882419.09587: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882419.09602: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882419.09617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882419.09635: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882419.09647: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882419.09659: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882419.09673: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882419.09688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882419.09708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882419.09720: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 
originally 10.31.11.158 <<< 13531 1726882419.09730: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882419.09742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882419.09822: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882419.09844: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882419.09858: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882419.09995: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882419.11856: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882419.11859: stdout chunk (state=3): >>><<< 13531 1726882419.11862: stderr chunk (state=3): >>><<< 13531 1726882419.11959: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882419.11963: _low_level_execute_command(): starting 13531 1726882419.11967: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882418.9644628-13867-230231670997745/AnsiballZ_dnf.py && sleep 0' 13531 1726882419.13502: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882419.13506: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882419.13509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882419.13511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882419.13542: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882419.13611: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882419.13637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882419.13672: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882419.13686: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882419.13697: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882419.13709: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882419.13723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882419.13738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882419.13771: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882419.13785: stderr chunk (state=3): >>>debug2: match 
found <<< 13531 1726882419.13799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882419.13937: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882419.13961: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882419.13992: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882419.14140: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882420.15826: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 13531 1726882420.21575: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 13531 1726882420.21600: stdout chunk (state=3): >>><<< 13531 1726882420.21604: stderr chunk (state=3): >>><<< 13531 1726882420.21748: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 13531 1726882420.21755: done with _execute_module (ansible.legacy.dnf, {'name': 'procps-ng', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882418.9644628-13867-230231670997745/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13531 1726882420.21758: _low_level_execute_command(): starting 13531 1726882420.21761: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882418.9644628-13867-230231670997745/ > /dev/null 2>&1 && sleep 0' 13531 1726882420.22357: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882420.22377: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882420.22392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882420.22414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882420.22459: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882420.22476: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882420.22490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 
1726882420.22510: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882420.22524: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882420.22536: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882420.22549: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882420.22570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882420.22587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882420.22600: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882420.22612: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882420.22632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882420.22712: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882420.22737: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882420.22759: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882420.22892: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882420.24798: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882420.24801: stdout chunk (state=3): >>><<< 13531 1726882420.24804: stderr chunk (state=3): >>><<< 13531 1726882420.24870: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882420.24874: handler run complete 13531 1726882420.25073: attempt loop complete, returning result 13531 1726882420.25076: _execute() done 13531 1726882420.25078: dumping result to json 13531 1726882420.25084: done dumping result, returning 13531 1726882420.25087: done running TaskExecutor() for managed_node2/TASK: Install pgrep, sysctl [0e448fcc-3ce9-4fd9-519d-000000000011] 13531 1726882420.25089: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000011 13531 1726882420.25171: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000011 13531 1726882420.25176: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 13531 1726882420.25255: no more pending results, returning what we have 13531 1726882420.25258: results queue empty 13531 1726882420.25259: checking for any_errors_fatal 13531 1726882420.25268: done checking for any_errors_fatal 13531 1726882420.25269: checking for max_fail_percentage 13531 1726882420.25271: done checking for max_fail_percentage 13531 1726882420.25271: checking to see if all hosts have failed and the running result is not ok 13531 
1726882420.25272: done checking to see if all hosts have failed 13531 1726882420.25273: getting the remaining hosts for this loop 13531 1726882420.25274: done getting the remaining hosts for this loop 13531 1726882420.25277: getting the next task for host managed_node2 13531 1726882420.25283: done getting next task for host managed_node2 13531 1726882420.25289: ^ task is: TASK: Create test interfaces 13531 1726882420.25292: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882420.25297: getting variables 13531 1726882420.25302: in VariableManager get_vars() 13531 1726882420.25363: Calling all_inventory to load vars for managed_node2 13531 1726882420.25371: Calling groups_inventory to load vars for managed_node2 13531 1726882420.25374: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882420.25385: Calling all_plugins_play to load vars for managed_node2 13531 1726882420.25387: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882420.25390: Calling groups_plugins_play to load vars for managed_node2 13531 1726882420.25866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882420.26101: done with get_vars() 13531 1726882420.26111: done getting variables 13531 1726882420.26210: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Create test interfaces] ************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35 Friday 20 September 2024 21:33:40 -0400 (0:00:01.443) 0:00:08.157 ****** 13531 1726882420.26237: entering _queue_task() for managed_node2/shell 13531 1726882420.26239: Creating lock for shell 13531 1726882420.26549: worker is 1 (out of 1 available) 13531 1726882420.26563: exiting _queue_task() for managed_node2/shell 13531 1726882420.26582: done queuing things up, now waiting for results queue to drain 13531 1726882420.26584: waiting for pending results... 13531 1726882420.26849: running TaskExecutor() for managed_node2/TASK: Create test interfaces 13531 1726882420.26979: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000012 13531 1726882420.26999: variable 'ansible_search_path' from source: unknown 13531 1726882420.27007: variable 'ansible_search_path' from source: unknown 13531 1726882420.27061: calling self._execute() 13531 1726882420.27171: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882420.27183: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882420.27197: variable 'omit' from source: magic vars 13531 1726882420.27627: variable 'ansible_distribution_major_version' from source: facts 13531 1726882420.27644: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882420.27657: variable 'omit' from source: magic vars 13531 1726882420.27722: variable 'omit' from source: magic vars 13531 1726882420.28179: variable 'dhcp_interface1' from source: play vars 13531 1726882420.28191: variable 'dhcp_interface2' from source: play vars 13531 1726882420.28239: variable 'omit' 
from source: magic vars 13531 1726882420.28295: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882420.28343: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882420.28370: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882420.28392: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882420.28408: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882420.28452: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882420.28464: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882420.28473: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882420.28864: Set connection var ansible_pipelining to False 13531 1726882420.28870: Set connection var ansible_timeout to 10 13531 1726882420.28876: Set connection var ansible_shell_executable to /bin/sh 13531 1726882420.28883: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882420.28886: Set connection var ansible_connection to ssh 13531 1726882420.28888: Set connection var ansible_shell_type to sh 13531 1726882420.28913: variable 'ansible_shell_executable' from source: unknown 13531 1726882420.28916: variable 'ansible_connection' from source: unknown 13531 1726882420.28918: variable 'ansible_module_compression' from source: unknown 13531 1726882420.28920: variable 'ansible_shell_type' from source: unknown 13531 1726882420.28923: variable 'ansible_shell_executable' from source: unknown 13531 1726882420.28925: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882420.28930: variable 'ansible_pipelining' from source: unknown 
13531 1726882420.28932: variable 'ansible_timeout' from source: unknown 13531 1726882420.28935: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882420.29060: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882420.29071: variable 'omit' from source: magic vars 13531 1726882420.29077: starting attempt loop 13531 1726882420.29080: running the handler 13531 1726882420.29089: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882420.29108: _low_level_execute_command(): starting 13531 1726882420.29115: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13531 1726882420.29814: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882420.29825: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882420.29836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882420.29850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882420.29892: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882420.29899: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882420.29909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882420.29922: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass <<< 13531 1726882420.29930: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882420.29937: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882420.29944: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882420.29957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882420.29968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882420.29976: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882420.29985: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882420.29993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882420.30066: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882420.30084: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882420.30097: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882420.30224: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882420.31840: stdout chunk (state=3): >>>/root <<< 13531 1726882420.31939: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882420.32033: stderr chunk (state=3): >>><<< 13531 1726882420.32047: stdout chunk (state=3): >>><<< 13531 1726882420.32187: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 
10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882420.32197: _low_level_execute_command(): starting 13531 1726882420.32199: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882420.3209307-13949-37598295920611 `" && echo ansible-tmp-1726882420.3209307-13949-37598295920611="` echo /root/.ansible/tmp/ansible-tmp-1726882420.3209307-13949-37598295920611 `" ) && sleep 0' 13531 1726882420.32777: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882420.32790: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882420.32803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882420.32819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882420.32868: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882420.32879: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882420.32889: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882420.32903: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882420.32912: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882420.32920: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882420.32929: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882420.32943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882420.32960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882420.32973: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882420.32983: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882420.32996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882420.33166: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882420.33193: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882420.33209: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882420.33341: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882420.35201: stdout chunk (state=3): >>>ansible-tmp-1726882420.3209307-13949-37598295920611=/root/.ansible/tmp/ansible-tmp-1726882420.3209307-13949-37598295920611 <<< 13531 1726882420.35397: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882420.35401: stdout chunk (state=3): >>><<< 13531 1726882420.35403: stderr chunk (state=3): >>><<< 13531 1726882420.35572: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882420.3209307-13949-37598295920611=/root/.ansible/tmp/ansible-tmp-1726882420.3209307-13949-37598295920611 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882420.35576: variable 'ansible_module_compression' from source: unknown 13531 1726882420.35579: ANSIBALLZ: Using generic lock for ansible.legacy.command 13531 1726882420.35581: ANSIBALLZ: Acquiring lock 13531 1726882420.35583: ANSIBALLZ: Lock acquired: 139969312288320 13531 1726882420.35586: ANSIBALLZ: Creating module 13531 1726882420.50790: ANSIBALLZ: Writing module into payload 13531 1726882420.50919: ANSIBALLZ: Writing module 13531 1726882420.50952: ANSIBALLZ: Renaming module 13531 1726882420.50975: ANSIBALLZ: Done creating module 13531 1726882420.50997: variable 'ansible_facts' from source: unknown 13531 1726882420.51080: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726882420.3209307-13949-37598295920611/AnsiballZ_command.py 13531 1726882420.51244: Sending initial data 13531 1726882420.51246: Sent initial data (155 bytes) 13531 1726882420.52283: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882420.52297: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882420.52311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882420.52329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882420.52386: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882420.52487: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882420.52503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882420.52521: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882420.52533: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882420.52545: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882420.52560: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882420.52577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882420.52597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882420.52610: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882420.52622: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882420.52636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 
1726882420.52712: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882420.52732: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882420.52745: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882420.52882: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882420.54710: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13531 1726882420.54807: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13531 1726882420.54909: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13531i_snf1g8/tmphb7ozk6s /root/.ansible/tmp/ansible-tmp-1726882420.3209307-13949-37598295920611/AnsiballZ_command.py <<< 13531 1726882420.55002: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13531 1726882420.56486: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882420.56684: stderr chunk (state=3): >>><<< 13531 1726882420.56687: stdout chunk (state=3): >>><<< 13531 1726882420.56690: done transferring module to remote 13531 1726882420.56692: _low_level_execute_command(): starting 13531 1726882420.56695: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726882420.3209307-13949-37598295920611/ /root/.ansible/tmp/ansible-tmp-1726882420.3209307-13949-37598295920611/AnsiballZ_command.py && sleep 0' 13531 1726882420.57607: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882420.57610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882420.57651: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 13531 1726882420.57659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882420.57662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 13531 1726882420.57665: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882420.57714: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882420.58685: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882420.58812: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882420.60521: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882420.60601: stderr chunk (state=3): >>><<< 13531 1726882420.60604: stdout chunk (state=3): >>><<< 13531 1726882420.60672: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882420.60677: _low_level_execute_command(): starting 13531 1726882420.60679: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882420.3209307-13949-37598295920611/AnsiballZ_command.py && sleep 0' 13531 1726882420.62118: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882420.62122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882420.62161: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882420.62167: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882420.62169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882420.62229: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882420.62233: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882420.62377: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882421.97089: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 6692 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 6692 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ grep -q 'inet [1-9]'\n+ ip addr show testbr\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master 
testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.co<<< 13531 1726882421.97121: stdout chunk (state=3): >>>m/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-20 21:33:40.753045", "end": "2024-09-20 21:33:41.968694", "delta": "0:00:01.215649", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! 
rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13531 1726882421.98526: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 13531 1726882421.98529: stdout chunk (state=3): >>><<< 13531 1726882421.98532: stderr chunk (state=3): >>><<< 13531 1726882421.98695: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 6692 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 6692 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ grep -q 'inet [1-9]'\n+ ip addr show testbr\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # 
NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active 
firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-20 21:33:40.753045", "end": "2024-09-20 21:33:41.968694", "delta": "0:00:01.215649", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! 
ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
13531 1726882421.98706: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n "$(pgrep NetworkManager)" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the \'testbr\' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n "$(pgrep NetworkManager)" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q \'inet [1-9]\'\ndo\n let "timer+=1"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\ndone\n\nif grep \'release 6\' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo \'interface testbr {\' > /etc/radvd.conf\n echo \' AdvSendAdvert on;\' >> /etc/radvd.conf\n echo \' prefix 2001:DB8::/64 { \' >> /etc/radvd.conf\n echo \' AdvOnLink on; }; \' >> /etc/radvd.conf\n echo \' }; \' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service="$service"; then\n firewall-cmd --add-service "$service"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882420.3209307-13949-37598295920611/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13531 1726882421.98710: _low_level_execute_command(): starting 13531 1726882421.98713: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882420.3209307-13949-37598295920611/ > /dev/null 2>&1 && sleep 0' 13531 1726882421.99581: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882421.99597: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882421.99611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882421.99628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882421.99676: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882421.99687: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882421.99701: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882421.99717: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882421.99728: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882421.99737: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882421.99747: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882421.99759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882421.99777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882421.99788: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882421.99801: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882421.99816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882421.99898: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882421.99922: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882421.99939: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882422.00069: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882422.01997: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882422.02001: stdout chunk (state=3): >>><<< 13531 1726882422.02003: stderr chunk (state=3): >>><<< 13531 1726882422.02471: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882422.02475: handler run complete 13531 1726882422.02478: Evaluated conditional (False): False 13531 1726882422.02480: attempt loop complete, returning result 13531 1726882422.02482: _execute() done 13531 1726882422.02488: dumping result to json 13531 1726882422.02490: done dumping result, returning 13531 1726882422.02492: done running TaskExecutor() for managed_node2/TASK: Create test interfaces [0e448fcc-3ce9-4fd9-519d-000000000012] 13531 1726882422.02495: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000012 13531 1726882422.02578: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000012 13531 1726882422.02581: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p 
managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "delta": "0:00:01.215649", "end": "2024-09-20 21:33:41.968694", "rc": 0, "start": "2024-09-20 21:33:40.753045" } STDERR: + exec + ip link add test1 type veth peer name test1p + ip link add test2 type veth peer name test2p ++ pgrep NetworkManager + '[' -n 6692 ']' + nmcli d set test1 managed true + nmcli d set test2 managed true + nmcli d set test1p managed false + nmcli d set test2p managed false + ip link set test1p up + ip link set test2p up + ip link add name testbr type bridge forward_delay 0 ++ pgrep NetworkManager + '[' -n 6692 ']' + nmcli d set testbr managed false + ip link set testbr up + timer=0 + ip addr show testbr + grep -q 'inet [1-9]' + let timer+=1 + '[' 1 -eq 30 ']' + sleep 1 + rc=0 + ip addr add 192.0.2.1/24 dev testbr + '[' 0 '!=' 0 ']' + ip -6 addr add 2001:DB8::1/32 dev testbr + '[' 0 '!=' 0 ']' + grep -q 'inet [1-9]' + ip addr show testbr + grep 'release 6' /etc/redhat-release + ip link set test1p master testbr + ip link set test2p master testbr + systemctl is-active firewalld inactive + dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces 13531 1726882422.02656: no more pending results, returning what we have 13531 1726882422.02659: results queue empty 13531 1726882422.02660: checking for any_errors_fatal 13531 1726882422.02673: done checking for any_errors_fatal 13531 1726882422.02674: checking for max_fail_percentage 13531 1726882422.02676: done checking for max_fail_percentage 13531 1726882422.02677: checking to see if all hosts have failed 
and the running result is not ok 13531 1726882422.02677: done checking to see if all hosts have failed 13531 1726882422.02678: getting the remaining hosts for this loop 13531 1726882422.02679: done getting the remaining hosts for this loop 13531 1726882422.02683: getting the next task for host managed_node2 13531 1726882422.02691: done getting next task for host managed_node2 13531 1726882422.02694: ^ task is: TASK: Include the task 'get_interface_stat.yml' 13531 1726882422.02697: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882422.02700: getting variables 13531 1726882422.02702: in VariableManager get_vars() 13531 1726882422.02751: Calling all_inventory to load vars for managed_node2 13531 1726882422.02757: Calling groups_inventory to load vars for managed_node2 13531 1726882422.02760: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882422.02773: Calling all_plugins_play to load vars for managed_node2 13531 1726882422.02776: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882422.02780: Calling groups_plugins_play to load vars for managed_node2 13531 1726882422.02956: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882422.03166: done with get_vars() 13531 1726882422.03178: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:33:42 -0400 (0:00:01.770) 0:00:09.928 ****** 13531 1726882422.03282: entering _queue_task() for managed_node2/include_tasks 13531 1726882422.03608: worker is 1 (out of 1 available) 13531 1726882422.03621: exiting _queue_task() for managed_node2/include_tasks 13531 1726882422.03635: done queuing things up, now waiting for results queue to drain 13531 1726882422.03636: waiting for pending results... 
13531 1726882422.03910: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 13531 1726882422.04030: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000016 13531 1726882422.04047: variable 'ansible_search_path' from source: unknown 13531 1726882422.04057: variable 'ansible_search_path' from source: unknown 13531 1726882422.04105: calling self._execute() 13531 1726882422.04196: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882422.04205: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882422.04218: variable 'omit' from source: magic vars 13531 1726882422.04606: variable 'ansible_distribution_major_version' from source: facts 13531 1726882422.04627: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882422.04638: _execute() done 13531 1726882422.04645: dumping result to json 13531 1726882422.04656: done dumping result, returning 13531 1726882422.04671: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [0e448fcc-3ce9-4fd9-519d-000000000016] 13531 1726882422.04681: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000016 13531 1726882422.04809: no more pending results, returning what we have 13531 1726882422.04815: in VariableManager get_vars() 13531 1726882422.04882: Calling all_inventory to load vars for managed_node2 13531 1726882422.04885: Calling groups_inventory to load vars for managed_node2 13531 1726882422.04888: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882422.04903: Calling all_plugins_play to load vars for managed_node2 13531 1726882422.04906: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882422.04908: Calling groups_plugins_play to load vars for managed_node2 13531 1726882422.05581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882422.05793: done 
with get_vars() 13531 1726882422.05801: variable 'ansible_search_path' from source: unknown 13531 1726882422.05803: variable 'ansible_search_path' from source: unknown 13531 1726882422.05845: we have included files to process 13531 1726882422.05846: generating all_blocks data 13531 1726882422.05847: done generating all_blocks data 13531 1726882422.05848: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13531 1726882422.05850: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13531 1726882422.05851: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13531 1726882422.06772: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000016 13531 1726882422.06776: WORKER PROCESS EXITING 13531 1726882422.06880: done processing included file 13531 1726882422.06882: iterating over new_blocks loaded from include file 13531 1726882422.06884: in VariableManager get_vars() 13531 1726882422.06910: done with get_vars() 13531 1726882422.06912: filtering new block on tags 13531 1726882422.06926: done filtering new block on tags 13531 1726882422.06928: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 13531 1726882422.06932: extending task lists for all hosts with included blocks 13531 1726882422.07020: done extending task lists 13531 1726882422.07021: done processing included files 13531 1726882422.07022: results queue empty 13531 1726882422.07023: checking for any_errors_fatal 13531 1726882422.07030: done checking for any_errors_fatal 13531 1726882422.07031: checking for max_fail_percentage 13531 1726882422.07032: done checking for max_fail_percentage 
13531 1726882422.07033: checking to see if all hosts have failed and the running result is not ok 13531 1726882422.07034: done checking to see if all hosts have failed 13531 1726882422.07035: getting the remaining hosts for this loop 13531 1726882422.07036: done getting the remaining hosts for this loop 13531 1726882422.07039: getting the next task for host managed_node2 13531 1726882422.07043: done getting next task for host managed_node2 13531 1726882422.07046: ^ task is: TASK: Get stat for interface {{ interface }} 13531 1726882422.07049: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882422.07051: getting variables 13531 1726882422.07055: in VariableManager get_vars() 13531 1726882422.07080: Calling all_inventory to load vars for managed_node2 13531 1726882422.07083: Calling groups_inventory to load vars for managed_node2 13531 1726882422.07085: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882422.07092: Calling all_plugins_play to load vars for managed_node2 13531 1726882422.07094: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882422.07097: Calling groups_plugins_play to load vars for managed_node2 13531 1726882422.07759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882422.08591: done with get_vars() 13531 1726882422.08604: done getting variables 13531 1726882422.08780: variable 'interface' from source: task vars 13531 1726882422.08785: variable 'dhcp_interface1' from source: play vars 13531 1726882422.08847: variable 'dhcp_interface1' from source: play vars TASK [Get stat for interface test1] ******************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:33:42 -0400 (0:00:00.056) 0:00:09.984 ****** 13531 1726882422.08895: entering _queue_task() for managed_node2/stat 13531 1726882422.09209: worker is 1 (out of 1 available) 13531 1726882422.09221: exiting _queue_task() for managed_node2/stat 13531 1726882422.09233: done queuing things up, now waiting for results queue to drain 13531 1726882422.09234: waiting for pending results... 
13531 1726882422.09516: running TaskExecutor() for managed_node2/TASK: Get stat for interface test1 13531 1726882422.09656: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000248 13531 1726882422.09679: variable 'ansible_search_path' from source: unknown 13531 1726882422.09688: variable 'ansible_search_path' from source: unknown 13531 1726882422.09732: calling self._execute() 13531 1726882422.09829: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882422.09842: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882422.09861: variable 'omit' from source: magic vars 13531 1726882422.10211: variable 'ansible_distribution_major_version' from source: facts 13531 1726882422.10234: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882422.10246: variable 'omit' from source: magic vars 13531 1726882422.10310: variable 'omit' from source: magic vars 13531 1726882422.10413: variable 'interface' from source: task vars 13531 1726882422.10423: variable 'dhcp_interface1' from source: play vars 13531 1726882422.10494: variable 'dhcp_interface1' from source: play vars 13531 1726882422.10516: variable 'omit' from source: magic vars 13531 1726882422.10573: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882422.10700: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882422.10723: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882422.10747: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882422.10767: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882422.10800: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 13531 1726882422.10807: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882422.10814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882422.10925: Set connection var ansible_pipelining to False 13531 1726882422.10935: Set connection var ansible_timeout to 10 13531 1726882422.10944: Set connection var ansible_shell_executable to /bin/sh 13531 1726882422.10955: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882422.10961: Set connection var ansible_connection to ssh 13531 1726882422.10970: Set connection var ansible_shell_type to sh 13531 1726882422.11002: variable 'ansible_shell_executable' from source: unknown 13531 1726882422.11009: variable 'ansible_connection' from source: unknown 13531 1726882422.11015: variable 'ansible_module_compression' from source: unknown 13531 1726882422.11020: variable 'ansible_shell_type' from source: unknown 13531 1726882422.11026: variable 'ansible_shell_executable' from source: unknown 13531 1726882422.11031: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882422.11039: variable 'ansible_pipelining' from source: unknown 13531 1726882422.11047: variable 'ansible_timeout' from source: unknown 13531 1726882422.11056: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882422.11262: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13531 1726882422.11280: variable 'omit' from source: magic vars 13531 1726882422.11290: starting attempt loop 13531 1726882422.11295: running the handler 13531 1726882422.11316: _low_level_execute_command(): starting 13531 1726882422.11327: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13531 
1726882422.12100: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882422.12115: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882422.12129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882422.12146: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882422.12196: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882422.12211: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882422.12224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882422.12241: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882422.12256: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882422.12271: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882422.12283: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882422.12296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882422.12310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882422.12324: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882422.12335: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882422.12347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882422.12429: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882422.12456: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882422.12474: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882422.12611: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882422.14293: stdout chunk (state=3): >>>/root <<< 13531 1726882422.14391: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882422.14488: stderr chunk (state=3): >>><<< 13531 1726882422.14500: stdout chunk (state=3): >>><<< 13531 1726882422.14637: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882422.14641: _low_level_execute_command(): starting 13531 1726882422.14644: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882422.1453626-14047-269846955311674 `" && echo 
ansible-tmp-1726882422.1453626-14047-269846955311674="` echo /root/.ansible/tmp/ansible-tmp-1726882422.1453626-14047-269846955311674 `" ) && sleep 0' 13531 1726882422.15242: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882422.15258: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882422.15274: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882422.15296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882422.15340: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882422.15351: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882422.15369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882422.15385: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882422.15399: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882422.15408: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882422.15419: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882422.15430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882422.15445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882422.15480: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882422.15493: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882422.15509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882422.15589: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master <<< 13531 1726882422.15610: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882422.15633: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882422.15768: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882422.17667: stdout chunk (state=3): >>>ansible-tmp-1726882422.1453626-14047-269846955311674=/root/.ansible/tmp/ansible-tmp-1726882422.1453626-14047-269846955311674 <<< 13531 1726882422.17769: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882422.17870: stderr chunk (state=3): >>><<< 13531 1726882422.17881: stdout chunk (state=3): >>><<< 13531 1726882422.17970: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882422.1453626-14047-269846955311674=/root/.ansible/tmp/ansible-tmp-1726882422.1453626-14047-269846955311674 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882422.17974: variable 'ansible_module_compression' from source: unknown 13531 1726882422.18236: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13531i_snf1g8/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 13531 1726882422.18239: variable 'ansible_facts' from source: unknown 13531 1726882422.18241: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882422.1453626-14047-269846955311674/AnsiballZ_stat.py 13531 1726882422.18307: Sending initial data 13531 1726882422.18310: Sent initial data (153 bytes) 13531 1726882422.19372: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882422.19391: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882422.19405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882422.19422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882422.19473: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882422.19484: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882422.19497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882422.19513: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882422.19524: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882422.19534: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882422.19548: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882422.19566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882422.19581: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882422.19592: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882422.19602: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882422.19613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882422.19698: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882422.19720: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882422.19734: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882422.19867: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882422.21671: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13531 1726882422.21769: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13531 1726882422.21868: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13531i_snf1g8/tmpnqvzbido /root/.ansible/tmp/ansible-tmp-1726882422.1453626-14047-269846955311674/AnsiballZ_stat.py <<< 13531 1726882422.21962: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13531 1726882422.23365: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882422.23498: stderr chunk (state=3): >>><<< 13531 1726882422.23502: stdout chunk (state=3): >>><<< 13531 1726882422.23602: done transferring module to remote 13531 1726882422.23610: _low_level_execute_command(): starting 13531 1726882422.23613: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882422.1453626-14047-269846955311674/ /root/.ansible/tmp/ansible-tmp-1726882422.1453626-14047-269846955311674/AnsiballZ_stat.py && sleep 0' 13531 1726882422.25638: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882422.25755: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882422.25773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882422.25791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882422.25834: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882422.25909: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882422.25943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882422.26019: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882422.26041: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882422.26077: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882422.26091: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882422.26158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882422.26194: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882422.26211: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882422.26226: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882422.26242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882422.26374: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882422.26445: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882422.26602: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882422.28373: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882422.28455: stderr chunk (state=3): >>><<< 13531 1726882422.28459: stdout chunk (state=3): >>><<< 13531 1726882422.28551: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882422.28558: _low_level_execute_command(): starting 13531 1726882422.28562: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882422.1453626-14047-269846955311674/AnsiballZ_stat.py && sleep 0' 13531 1726882422.29418: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882422.29437: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882422.29462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882422.29485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882422.29526: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882422.29541: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882422.29582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882422.29601: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882422.29614: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882422.29627: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882422.29639: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882422.29659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882422.29683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882422.29696: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 
10.31.11.158 <<< 13531 1726882422.29708: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882422.29721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882422.29803: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882422.29827: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882422.29843: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882422.29991: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882422.43168: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 26930, "dev": 21, "nlink": 1, "atime": 1726882420.7609563, "mtime": 1726882420.7609563, "ctime": 1726882420.7609563, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 13531 1726882422.44271: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 13531 1726882422.44276: stdout chunk (state=3): >>><<< 13531 1726882422.44292: stderr chunk (state=3): >>><<< 13531 1726882422.44456: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 26930, "dev": 21, "nlink": 1, "atime": 1726882420.7609563, "mtime": 1726882420.7609563, "ctime": 1726882420.7609563, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 13531 1726882422.44468: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882422.1453626-14047-269846955311674/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13531 1726882422.44471: _low_level_execute_command(): starting 13531 1726882422.44473: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882422.1453626-14047-269846955311674/ > /dev/null 2>&1 && sleep 0' 13531 1726882422.45089: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882422.45103: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882422.45118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882422.45142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882422.45187: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882422.45199: 
stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882422.45212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882422.45229: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882422.45247: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882422.45258: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882422.45273: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882422.45286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882422.45301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882422.45312: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882422.45323: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882422.45335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882422.45420: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882422.45442: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882422.45464: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882422.45685: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882422.47596: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882422.47600: stdout chunk (state=3): >>><<< 13531 1726882422.47602: stderr chunk (state=3): >>><<< 13531 1726882422.47873: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882422.47877: handler run complete 13531 1726882422.47880: attempt loop complete, returning result 13531 1726882422.47882: _execute() done 13531 1726882422.47884: dumping result to json 13531 1726882422.47886: done dumping result, returning 13531 1726882422.47888: done running TaskExecutor() for managed_node2/TASK: Get stat for interface test1 [0e448fcc-3ce9-4fd9-519d-000000000248] 13531 1726882422.47890: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000248 13531 1726882422.47968: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000248 13531 1726882422.47972: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "atime": 1726882420.7609563, "block_size": 4096, "blocks": 0, "ctime": 1726882420.7609563, "dev": 21, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 26930, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": 
true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "mode": "0777", "mtime": 1726882420.7609563, "nlink": 1, "path": "/sys/class/net/test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 13531 1726882422.48072: no more pending results, returning what we have 13531 1726882422.48077: results queue empty 13531 1726882422.48078: checking for any_errors_fatal 13531 1726882422.48080: done checking for any_errors_fatal 13531 1726882422.48081: checking for max_fail_percentage 13531 1726882422.48084: done checking for max_fail_percentage 13531 1726882422.48085: checking to see if all hosts have failed and the running result is not ok 13531 1726882422.48086: done checking to see if all hosts have failed 13531 1726882422.48086: getting the remaining hosts for this loop 13531 1726882422.48088: done getting the remaining hosts for this loop 13531 1726882422.48092: getting the next task for host managed_node2 13531 1726882422.48100: done getting next task for host managed_node2 13531 1726882422.48103: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 13531 1726882422.48106: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882422.48110: getting variables 13531 1726882422.48112: in VariableManager get_vars() 13531 1726882422.48168: Calling all_inventory to load vars for managed_node2 13531 1726882422.48176: Calling groups_inventory to load vars for managed_node2 13531 1726882422.48179: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882422.48191: Calling all_plugins_play to load vars for managed_node2 13531 1726882422.48194: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882422.48197: Calling groups_plugins_play to load vars for managed_node2 13531 1726882422.48536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882422.48916: done with get_vars() 13531 1726882422.48931: done getting variables 13531 1726882422.49028: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 13531 1726882422.49151: variable 'interface' from source: task vars 13531 1726882422.49155: variable 'dhcp_interface1' from source: play vars 13531 1726882422.49211: variable 'dhcp_interface1' from source: play vars TASK [Assert that the interface is present - 'test1'] ************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:33:42 -0400 (0:00:00.403) 0:00:10.387 ****** 13531 1726882422.49241: entering _queue_task() for managed_node2/assert 13531 1726882422.49243: Creating lock for assert 13531 1726882422.49914: worker is 1 (out of 1 available) 13531 1726882422.49926: exiting _queue_task() for managed_node2/assert 13531 1726882422.49937: done queuing things up, now waiting for results queue to drain 13531 
1726882422.49938: waiting for pending results... 13531 1726882422.50780: running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'test1' 13531 1726882422.50914: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000017 13531 1726882422.50936: variable 'ansible_search_path' from source: unknown 13531 1726882422.50943: variable 'ansible_search_path' from source: unknown 13531 1726882422.50996: calling self._execute() 13531 1726882422.51089: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882422.51100: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882422.51118: variable 'omit' from source: magic vars 13531 1726882422.51500: variable 'ansible_distribution_major_version' from source: facts 13531 1726882422.51522: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882422.51532: variable 'omit' from source: magic vars 13531 1726882422.51588: variable 'omit' from source: magic vars 13531 1726882422.51705: variable 'interface' from source: task vars 13531 1726882422.51716: variable 'dhcp_interface1' from source: play vars 13531 1726882422.51791: variable 'dhcp_interface1' from source: play vars 13531 1726882422.51814: variable 'omit' from source: magic vars 13531 1726882422.51869: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882422.51912: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882422.51937: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882422.51965: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882422.51982: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882422.52018: 
variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882422.52028: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882422.52035: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882422.52152: Set connection var ansible_pipelining to False 13531 1726882422.52167: Set connection var ansible_timeout to 10 13531 1726882422.52180: Set connection var ansible_shell_executable to /bin/sh 13531 1726882422.52190: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882422.52195: Set connection var ansible_connection to ssh 13531 1726882422.52203: Set connection var ansible_shell_type to sh 13531 1726882422.52236: variable 'ansible_shell_executable' from source: unknown 13531 1726882422.52245: variable 'ansible_connection' from source: unknown 13531 1726882422.52252: variable 'ansible_module_compression' from source: unknown 13531 1726882422.52260: variable 'ansible_shell_type' from source: unknown 13531 1726882422.52272: variable 'ansible_shell_executable' from source: unknown 13531 1726882422.52280: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882422.52287: variable 'ansible_pipelining' from source: unknown 13531 1726882422.52294: variable 'ansible_timeout' from source: unknown 13531 1726882422.52301: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882422.52446: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882422.52462: variable 'omit' from source: magic vars 13531 1726882422.52474: starting attempt loop 13531 1726882422.52480: running the handler 13531 1726882422.52624: variable 'interface_stat' from source: set_fact 13531 
1726882422.52652: Evaluated conditional (interface_stat.stat.exists): True 13531 1726882422.52666: handler run complete 13531 1726882422.52685: attempt loop complete, returning result 13531 1726882422.52691: _execute() done 13531 1726882422.52697: dumping result to json 13531 1726882422.52708: done dumping result, returning 13531 1726882422.52718: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'test1' [0e448fcc-3ce9-4fd9-519d-000000000017] 13531 1726882422.52728: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000017 ok: [managed_node2] => { "changed": false } MSG: All assertions passed 13531 1726882422.52874: no more pending results, returning what we have 13531 1726882422.52878: results queue empty 13531 1726882422.52879: checking for any_errors_fatal 13531 1726882422.52888: done checking for any_errors_fatal 13531 1726882422.52889: checking for max_fail_percentage 13531 1726882422.52891: done checking for max_fail_percentage 13531 1726882422.52892: checking to see if all hosts have failed and the running result is not ok 13531 1726882422.52892: done checking to see if all hosts have failed 13531 1726882422.52893: getting the remaining hosts for this loop 13531 1726882422.52895: done getting the remaining hosts for this loop 13531 1726882422.52898: getting the next task for host managed_node2 13531 1726882422.52907: done getting next task for host managed_node2 13531 1726882422.52910: ^ task is: TASK: Include the task 'get_interface_stat.yml' 13531 1726882422.52913: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882422.52917: getting variables 13531 1726882422.52918: in VariableManager get_vars() 13531 1726882422.52979: Calling all_inventory to load vars for managed_node2 13531 1726882422.52982: Calling groups_inventory to load vars for managed_node2 13531 1726882422.52985: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882422.52997: Calling all_plugins_play to load vars for managed_node2 13531 1726882422.53000: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882422.53003: Calling groups_plugins_play to load vars for managed_node2 13531 1726882422.53193: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882422.53409: done with get_vars() 13531 1726882422.53420: done getting variables 13531 1726882422.53627: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000017 13531 1726882422.53631: WORKER PROCESS EXITING TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:33:42 -0400 (0:00:00.044) 0:00:10.431 ****** 13531 1726882422.53645: entering _queue_task() for managed_node2/include_tasks 13531 1726882422.54074: worker is 1 (out of 1 available) 13531 1726882422.54085: exiting _queue_task() for managed_node2/include_tasks 13531 1726882422.54097: done queuing things up, now waiting for results queue to drain 13531 1726882422.54098: waiting for pending results... 
13531 1726882422.54355: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 13531 1726882422.54485: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000001b 13531 1726882422.54509: variable 'ansible_search_path' from source: unknown 13531 1726882422.54518: variable 'ansible_search_path' from source: unknown 13531 1726882422.54566: calling self._execute() 13531 1726882422.54656: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882422.54671: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882422.54687: variable 'omit' from source: magic vars 13531 1726882422.55138: variable 'ansible_distribution_major_version' from source: facts 13531 1726882422.55160: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882422.55173: _execute() done 13531 1726882422.55184: dumping result to json 13531 1726882422.55194: done dumping result, returning 13531 1726882422.55203: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [0e448fcc-3ce9-4fd9-519d-00000000001b] 13531 1726882422.55215: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000001b 13531 1726882422.55340: no more pending results, returning what we have 13531 1726882422.55346: in VariableManager get_vars() 13531 1726882422.55409: Calling all_inventory to load vars for managed_node2 13531 1726882422.55412: Calling groups_inventory to load vars for managed_node2 13531 1726882422.55415: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882422.55430: Calling all_plugins_play to load vars for managed_node2 13531 1726882422.55433: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882422.55437: Calling groups_plugins_play to load vars for managed_node2 13531 1726882422.55671: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882422.55870: done 
with get_vars() 13531 1726882422.55883: variable 'ansible_search_path' from source: unknown 13531 1726882422.55885: variable 'ansible_search_path' from source: unknown 13531 1726882422.55923: we have included files to process 13531 1726882422.55925: generating all_blocks data 13531 1726882422.55926: done generating all_blocks data 13531 1726882422.55931: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13531 1726882422.55933: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13531 1726882422.55936: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13531 1726882422.56169: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000001b 13531 1726882422.56172: WORKER PROCESS EXITING 13531 1726882422.56395: done processing included file 13531 1726882422.56397: iterating over new_blocks loaded from include file 13531 1726882422.56399: in VariableManager get_vars() 13531 1726882422.56440: done with get_vars() 13531 1726882422.56442: filtering new block on tags 13531 1726882422.56458: done filtering new block on tags 13531 1726882422.56460: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 13531 1726882422.56466: extending task lists for all hosts with included blocks 13531 1726882422.56571: done extending task lists 13531 1726882422.56572: done processing included files 13531 1726882422.56573: results queue empty 13531 1726882422.56573: checking for any_errors_fatal 13531 1726882422.56577: done checking for any_errors_fatal 13531 1726882422.56578: checking for max_fail_percentage 13531 1726882422.56579: done checking for max_fail_percentage 
13531 1726882422.56580: checking to see if all hosts have failed and the running result is not ok 13531 1726882422.56581: done checking to see if all hosts have failed 13531 1726882422.56582: getting the remaining hosts for this loop 13531 1726882422.56583: done getting the remaining hosts for this loop 13531 1726882422.56586: getting the next task for host managed_node2 13531 1726882422.56590: done getting next task for host managed_node2 13531 1726882422.56592: ^ task is: TASK: Get stat for interface {{ interface }} 13531 1726882422.56595: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882422.56597: getting variables 13531 1726882422.56598: in VariableManager get_vars() 13531 1726882422.56616: Calling all_inventory to load vars for managed_node2 13531 1726882422.56619: Calling groups_inventory to load vars for managed_node2 13531 1726882422.56621: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882422.56626: Calling all_plugins_play to load vars for managed_node2 13531 1726882422.56628: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882422.56631: Calling groups_plugins_play to load vars for managed_node2 13531 1726882422.56784: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882422.57015: done with get_vars() 13531 1726882422.57023: done getting variables 13531 1726882422.57176: variable 'interface' from source: task vars 13531 1726882422.57180: variable 'dhcp_interface2' from source: play vars 13531 1726882422.57243: variable 'dhcp_interface2' from source: play vars TASK [Get stat for interface test2] ******************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:33:42 -0400 (0:00:00.036) 0:00:10.467 ****** 13531 1726882422.57274: entering _queue_task() for managed_node2/stat 13531 1726882422.57538: worker is 1 (out of 1 available) 13531 1726882422.57549: exiting _queue_task() for managed_node2/stat 13531 1726882422.57561: done queuing things up, now waiting for results queue to drain 13531 1726882422.57563: waiting for pending results... 
13531 1726882422.57829: running TaskExecutor() for managed_node2/TASK: Get stat for interface test2 13531 1726882422.57974: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000260 13531 1726882422.57992: variable 'ansible_search_path' from source: unknown 13531 1726882422.58002: variable 'ansible_search_path' from source: unknown 13531 1726882422.58045: calling self._execute() 13531 1726882422.58140: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882422.58152: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882422.58171: variable 'omit' from source: magic vars 13531 1726882422.58539: variable 'ansible_distribution_major_version' from source: facts 13531 1726882422.58561: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882422.58576: variable 'omit' from source: magic vars 13531 1726882422.58640: variable 'omit' from source: magic vars 13531 1726882422.58750: variable 'interface' from source: task vars 13531 1726882422.58760: variable 'dhcp_interface2' from source: play vars 13531 1726882422.58833: variable 'dhcp_interface2' from source: play vars 13531 1726882422.58857: variable 'omit' from source: magic vars 13531 1726882422.58910: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882422.58954: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882422.58981: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882422.59007: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882422.59024: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882422.59066: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 13531 1726882422.59076: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882422.59084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882422.59198: Set connection var ansible_pipelining to False 13531 1726882422.59214: Set connection var ansible_timeout to 10 13531 1726882422.59225: Set connection var ansible_shell_executable to /bin/sh 13531 1726882422.59235: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882422.59242: Set connection var ansible_connection to ssh 13531 1726882422.59249: Set connection var ansible_shell_type to sh 13531 1726882422.59285: variable 'ansible_shell_executable' from source: unknown 13531 1726882422.59293: variable 'ansible_connection' from source: unknown 13531 1726882422.59300: variable 'ansible_module_compression' from source: unknown 13531 1726882422.59308: variable 'ansible_shell_type' from source: unknown 13531 1726882422.59318: variable 'ansible_shell_executable' from source: unknown 13531 1726882422.59325: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882422.59333: variable 'ansible_pipelining' from source: unknown 13531 1726882422.59341: variable 'ansible_timeout' from source: unknown 13531 1726882422.59348: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882422.59566: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13531 1726882422.59584: variable 'omit' from source: magic vars 13531 1726882422.59599: starting attempt loop 13531 1726882422.59607: running the handler 13531 1726882422.59625: _low_level_execute_command(): starting 13531 1726882422.59641: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13531 
1726882422.60441: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882422.60458: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882422.60479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882422.60500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882422.60550: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882422.60567: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882422.60583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882422.60602: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882422.60615: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882422.60632: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882422.60646: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882422.60661: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882422.60686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882422.60701: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882422.60713: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882422.60732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882422.60812: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882422.60838: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882422.60856: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882422.60996: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882422.62657: stdout chunk (state=3): >>>/root <<< 13531 1726882422.62755: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882422.62851: stderr chunk (state=3): >>><<< 13531 1726882422.62866: stdout chunk (state=3): >>><<< 13531 1726882422.62973: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882422.62977: _low_level_execute_command(): starting 13531 1726882422.62981: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882422.6290288-14081-251521307249256 `" && echo 
ansible-tmp-1726882422.6290288-14081-251521307249256="` echo /root/.ansible/tmp/ansible-tmp-1726882422.6290288-14081-251521307249256 `" ) && sleep 0' 13531 1726882422.63643: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882422.63661: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882422.63678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882422.63696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882422.63738: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882422.63754: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882422.63775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882422.63793: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882422.63806: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882422.63817: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882422.63829: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882422.63842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882422.63858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882422.63876: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882422.63888: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882422.63901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882422.63973: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master <<< 13531 1726882422.63999: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882422.64016: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882422.64149: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882422.66057: stdout chunk (state=3): >>>ansible-tmp-1726882422.6290288-14081-251521307249256=/root/.ansible/tmp/ansible-tmp-1726882422.6290288-14081-251521307249256 <<< 13531 1726882422.66158: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882422.66254: stderr chunk (state=3): >>><<< 13531 1726882422.66268: stdout chunk (state=3): >>><<< 13531 1726882422.66573: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882422.6290288-14081-251521307249256=/root/.ansible/tmp/ansible-tmp-1726882422.6290288-14081-251521307249256 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882422.66577: variable 'ansible_module_compression' from source: unknown 13531 1726882422.66580: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13531i_snf1g8/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 13531 1726882422.66582: variable 'ansible_facts' from source: unknown 13531 1726882422.66584: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882422.6290288-14081-251521307249256/AnsiballZ_stat.py 13531 1726882422.66719: Sending initial data 13531 1726882422.66723: Sent initial data (153 bytes) 13531 1726882422.67758: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882422.67782: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882422.67798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882422.67815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882422.67859: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882422.67875: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882422.67895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882422.67913: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882422.67924: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882422.67934: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882422.67945: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882422.67958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882422.67976: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882422.67991: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882422.68006: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882422.68020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882422.68100: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882422.68129: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882422.68145: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882422.68278: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882422.70045: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13531 1726882422.70134: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13531 1726882422.70232: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13531i_snf1g8/tmps5lj2eeb /root/.ansible/tmp/ansible-tmp-1726882422.6290288-14081-251521307249256/AnsiballZ_stat.py <<< 13531 1726882422.70325: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13531 1726882422.71624: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882422.71895: stderr chunk (state=3): >>><<< 13531 1726882422.71898: stdout chunk (state=3): >>><<< 13531 1726882422.71900: done transferring module to remote 13531 1726882422.71906: _low_level_execute_command(): starting 13531 1726882422.71908: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882422.6290288-14081-251521307249256/ /root/.ansible/tmp/ansible-tmp-1726882422.6290288-14081-251521307249256/AnsiballZ_stat.py && sleep 0' 13531 1726882422.72507: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882422.72520: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882422.72533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882422.72549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882422.72600: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882422.72611: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882422.72624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882422.72640: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882422.72650: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882422.72661: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882422.72680: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882422.72693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882422.72706: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882422.72717: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882422.72726: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882422.72738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882422.72820: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882422.72840: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882422.72854: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882422.72984: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882422.74740: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882422.74822: stderr chunk (state=3): >>><<< 13531 1726882422.74827: stdout chunk (state=3): >>><<< 13531 1726882422.74851: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882422.74856: _low_level_execute_command(): starting 13531 1726882422.74860: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882422.6290288-14081-251521307249256/AnsiballZ_stat.py && sleep 0' 13531 1726882422.75513: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882422.75522: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882422.75533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882422.75547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882422.75592: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882422.75601: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882422.75609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882422.75623: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882422.75631: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882422.75637: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882422.75645: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882422.75655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882422.75672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882422.75680: stderr chunk (state=3): >>>debug2: checking match 
for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882422.75686: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882422.75695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882422.75768: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882422.75787: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882422.75798: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882422.75934: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882422.88944: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27516, "dev": 21, "nlink": 1, "atime": 1726882420.7679367, "mtime": 1726882420.7679367, "ctime": 1726882420.7679367, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} <<< 13531 1726882422.89992: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 13531 1726882422.90072: stderr chunk (state=3): >>><<< 13531 1726882422.90086: stdout chunk (state=3): >>><<< 13531 1726882422.90245: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27516, "dev": 21, "nlink": 1, "atime": 1726882420.7679367, "mtime": 1726882420.7679367, "ctime": 1726882420.7679367, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 13531 1726882422.90255: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test2', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882422.6290288-14081-251521307249256/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13531 1726882422.90258: _low_level_execute_command(): starting 13531 1726882422.90260: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882422.6290288-14081-251521307249256/ > /dev/null 2>&1 && sleep 0' 13531 1726882422.90882: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882422.90899: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882422.90920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882422.90937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882422.90983: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882422.90995: 
stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882422.91011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882422.91034: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882422.91046: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882422.91056: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882422.91070: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882422.91082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882422.91096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882422.91107: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882422.91120: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882422.91136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882422.91216: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882422.91245: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882422.91266: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882422.91398: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882422.93229: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882422.93316: stderr chunk (state=3): >>><<< 13531 1726882422.93330: stdout chunk (state=3): >>><<< 13531 1726882422.93368: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882422.93371: handler run complete 13531 1726882422.93675: attempt loop complete, returning result 13531 1726882422.93678: _execute() done 13531 1726882422.93680: dumping result to json 13531 1726882422.93682: done dumping result, returning 13531 1726882422.93684: done running TaskExecutor() for managed_node2/TASK: Get stat for interface test2 [0e448fcc-3ce9-4fd9-519d-000000000260] 13531 1726882422.93686: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000260 13531 1726882422.93763: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000260 13531 1726882422.93768: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "atime": 1726882420.7679367, "block_size": 4096, "blocks": 0, "ctime": 1726882420.7679367, "dev": 21, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 27516, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": 
true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "mode": "0777", "mtime": 1726882420.7679367, "nlink": 1, "path": "/sys/class/net/test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 13531 1726882422.93872: no more pending results, returning what we have 13531 1726882422.93876: results queue empty 13531 1726882422.93877: checking for any_errors_fatal 13531 1726882422.93879: done checking for any_errors_fatal 13531 1726882422.93879: checking for max_fail_percentage 13531 1726882422.93881: done checking for max_fail_percentage 13531 1726882422.93882: checking to see if all hosts have failed and the running result is not ok 13531 1726882422.93883: done checking to see if all hosts have failed 13531 1726882422.93883: getting the remaining hosts for this loop 13531 1726882422.93885: done getting the remaining hosts for this loop 13531 1726882422.93888: getting the next task for host managed_node2 13531 1726882422.93895: done getting next task for host managed_node2 13531 1726882422.93898: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 13531 1726882422.93901: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882422.93904: getting variables 13531 1726882422.93906: in VariableManager get_vars() 13531 1726882422.93956: Calling all_inventory to load vars for managed_node2 13531 1726882422.93964: Calling groups_inventory to load vars for managed_node2 13531 1726882422.93967: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882422.93978: Calling all_plugins_play to load vars for managed_node2 13531 1726882422.93981: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882422.93984: Calling groups_plugins_play to load vars for managed_node2 13531 1726882422.94159: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882422.94465: done with get_vars() 13531 1726882422.94476: done getting variables 13531 1726882422.94540: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13531 1726882422.94672: variable 'interface' from source: task vars 13531 1726882422.94676: variable 'dhcp_interface2' from source: play vars 13531 1726882422.94741: variable 'dhcp_interface2' from source: play vars TASK [Assert that the interface is present - 'test2'] ************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:33:42 -0400 (0:00:00.374) 0:00:10.842 ****** 13531 1726882422.94772: entering _queue_task() for managed_node2/assert 13531 1726882422.95040: worker is 1 (out of 1 available) 13531 1726882422.95052: exiting _queue_task() for managed_node2/assert 13531 1726882422.95063: done queuing things up, now waiting for results queue to drain 13531 1726882422.95065: waiting for pending results... 
13531 1726882422.95325: running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'test2' 13531 1726882422.95444: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000001c 13531 1726882422.95462: variable 'ansible_search_path' from source: unknown 13531 1726882422.95476: variable 'ansible_search_path' from source: unknown 13531 1726882422.95518: calling self._execute() 13531 1726882422.95604: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882422.95619: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882422.95632: variable 'omit' from source: magic vars 13531 1726882422.96074: variable 'ansible_distribution_major_version' from source: facts 13531 1726882422.96090: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882422.96100: variable 'omit' from source: magic vars 13531 1726882422.96157: variable 'omit' from source: magic vars 13531 1726882422.96262: variable 'interface' from source: task vars 13531 1726882422.96274: variable 'dhcp_interface2' from source: play vars 13531 1726882422.96343: variable 'dhcp_interface2' from source: play vars 13531 1726882422.96371: variable 'omit' from source: magic vars 13531 1726882422.96418: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882422.96461: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882422.96494: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882422.96514: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882422.96527: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882422.96568: variable 'inventory_hostname' from source: host 
vars for 'managed_node2' 13531 1726882422.96575: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882422.96583: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882422.96687: Set connection var ansible_pipelining to False 13531 1726882422.96701: Set connection var ansible_timeout to 10 13531 1726882422.96711: Set connection var ansible_shell_executable to /bin/sh 13531 1726882422.96720: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882422.96726: Set connection var ansible_connection to ssh 13531 1726882422.96732: Set connection var ansible_shell_type to sh 13531 1726882422.96762: variable 'ansible_shell_executable' from source: unknown 13531 1726882422.96776: variable 'ansible_connection' from source: unknown 13531 1726882422.96785: variable 'ansible_module_compression' from source: unknown 13531 1726882422.96792: variable 'ansible_shell_type' from source: unknown 13531 1726882422.96798: variable 'ansible_shell_executable' from source: unknown 13531 1726882422.96808: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882422.96815: variable 'ansible_pipelining' from source: unknown 13531 1726882422.96823: variable 'ansible_timeout' from source: unknown 13531 1726882422.96830: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882422.96971: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882422.96991: variable 'omit' from source: magic vars 13531 1726882422.97005: starting attempt loop 13531 1726882422.97011: running the handler 13531 1726882422.97153: variable 'interface_stat' from source: set_fact 13531 1726882422.97178: Evaluated conditional 
(interface_stat.stat.exists): True 13531 1726882422.97189: handler run complete 13531 1726882422.97214: attempt loop complete, returning result 13531 1726882422.97224: _execute() done 13531 1726882422.97231: dumping result to json 13531 1726882422.97242: done dumping result, returning 13531 1726882422.97252: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'test2' [0e448fcc-3ce9-4fd9-519d-00000000001c] 13531 1726882422.97262: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000001c ok: [managed_node2] => { "changed": false } MSG: All assertions passed 13531 1726882422.97415: no more pending results, returning what we have 13531 1726882422.97419: results queue empty 13531 1726882422.97420: checking for any_errors_fatal 13531 1726882422.97430: done checking for any_errors_fatal 13531 1726882422.97431: checking for max_fail_percentage 13531 1726882422.97433: done checking for max_fail_percentage 13531 1726882422.97434: checking to see if all hosts have failed and the running result is not ok 13531 1726882422.97435: done checking to see if all hosts have failed 13531 1726882422.97436: getting the remaining hosts for this loop 13531 1726882422.97437: done getting the remaining hosts for this loop 13531 1726882422.97440: getting the next task for host managed_node2 13531 1726882422.97447: done getting next task for host managed_node2 13531 1726882422.97450: ^ task is: TASK: Backup the /etc/resolv.conf for initscript 13531 1726882422.97452: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882422.97456: getting variables 13531 1726882422.97458: in VariableManager get_vars() 13531 1726882422.97516: Calling all_inventory to load vars for managed_node2 13531 1726882422.97519: Calling groups_inventory to load vars for managed_node2 13531 1726882422.97521: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882422.97532: Calling all_plugins_play to load vars for managed_node2 13531 1726882422.97534: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882422.97536: Calling groups_plugins_play to load vars for managed_node2 13531 1726882422.97738: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882422.98230: done with get_vars() 13531 1726882422.98239: done getting variables 13531 1726882422.98277: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000001c 13531 1726882422.98280: WORKER PROCESS EXITING 13531 1726882422.98311: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Backup the /etc/resolv.conf for initscript] ****************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:28 Friday 20 September 2024 21:33:42 -0400 (0:00:00.035) 0:00:10.878 ****** 13531 1726882422.98335: entering _queue_task() for managed_node2/command 13531 1726882422.98591: worker is 1 (out of 1 available) 13531 1726882422.98603: exiting _queue_task() for managed_node2/command 13531 1726882422.98614: done queuing things up, now waiting for results queue to drain 13531 1726882422.98616: waiting for pending results... 
13531 1726882422.98886: running TaskExecutor() for managed_node2/TASK: Backup the /etc/resolv.conf for initscript 13531 1726882422.98988: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000001d 13531 1726882422.99009: variable 'ansible_search_path' from source: unknown 13531 1726882422.99049: calling self._execute() 13531 1726882422.99143: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882422.99157: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882422.99177: variable 'omit' from source: magic vars 13531 1726882422.99557: variable 'ansible_distribution_major_version' from source: facts 13531 1726882422.99578: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882422.99702: variable 'network_provider' from source: set_fact 13531 1726882422.99718: Evaluated conditional (network_provider == "initscripts"): False 13531 1726882422.99726: when evaluation is False, skipping this task 13531 1726882422.99733: _execute() done 13531 1726882422.99739: dumping result to json 13531 1726882422.99745: done dumping result, returning 13531 1726882422.99758: done running TaskExecutor() for managed_node2/TASK: Backup the /etc/resolv.conf for initscript [0e448fcc-3ce9-4fd9-519d-00000000001d] 13531 1726882422.99772: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000001d skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 13531 1726882422.99916: no more pending results, returning what we have 13531 1726882422.99921: results queue empty 13531 1726882422.99922: checking for any_errors_fatal 13531 1726882422.99929: done checking for any_errors_fatal 13531 1726882422.99930: checking for max_fail_percentage 13531 1726882422.99932: done checking for max_fail_percentage 13531 1726882422.99933: checking to see if all hosts have failed and the running result is not ok 13531 
1726882422.99934: done checking to see if all hosts have failed 13531 1726882422.99935: getting the remaining hosts for this loop 13531 1726882422.99937: done getting the remaining hosts for this loop 13531 1726882422.99940: getting the next task for host managed_node2 13531 1726882422.99946: done getting next task for host managed_node2 13531 1726882422.99948: ^ task is: TASK: TEST Add Bond with 2 ports 13531 1726882422.99951: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882422.99957: getting variables 13531 1726882422.99959: in VariableManager get_vars() 13531 1726882423.00020: Calling all_inventory to load vars for managed_node2 13531 1726882423.00023: Calling groups_inventory to load vars for managed_node2 13531 1726882423.00025: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882423.00039: Calling all_plugins_play to load vars for managed_node2 13531 1726882423.00042: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882423.00046: Calling groups_plugins_play to load vars for managed_node2 13531 1726882423.00244: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882423.00454: done with get_vars() 13531 1726882423.00468: done getting variables 13531 1726882423.00551: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13531 1726882423.00875: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000001d 13531 1726882423.00878: WORKER PROCESS 
EXITING TASK [TEST Add Bond with 2 ports] ********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:33 Friday 20 September 2024 21:33:43 -0400 (0:00:00.025) 0:00:10.904 ****** 13531 1726882423.00892: entering _queue_task() for managed_node2/debug 13531 1726882423.01165: worker is 1 (out of 1 available) 13531 1726882423.01179: exiting _queue_task() for managed_node2/debug 13531 1726882423.01191: done queuing things up, now waiting for results queue to drain 13531 1726882423.01193: waiting for pending results... 13531 1726882423.01479: running TaskExecutor() for managed_node2/TASK: TEST Add Bond with 2 ports 13531 1726882423.01577: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000001e 13531 1726882423.01595: variable 'ansible_search_path' from source: unknown 13531 1726882423.01641: calling self._execute() 13531 1726882423.01727: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882423.01737: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882423.01756: variable 'omit' from source: magic vars 13531 1726882423.02183: variable 'ansible_distribution_major_version' from source: facts 13531 1726882423.02200: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882423.02211: variable 'omit' from source: magic vars 13531 1726882423.02234: variable 'omit' from source: magic vars 13531 1726882423.02275: variable 'omit' from source: magic vars 13531 1726882423.02326: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882423.02366: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882423.02389: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882423.02414: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882423.02428: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882423.02465: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882423.02473: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882423.02479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882423.02581: Set connection var ansible_pipelining to False 13531 1726882423.02591: Set connection var ansible_timeout to 10 13531 1726882423.02599: Set connection var ansible_shell_executable to /bin/sh 13531 1726882423.02607: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882423.02616: Set connection var ansible_connection to ssh 13531 1726882423.02622: Set connection var ansible_shell_type to sh 13531 1726882423.02652: variable 'ansible_shell_executable' from source: unknown 13531 1726882423.02665: variable 'ansible_connection' from source: unknown 13531 1726882423.02673: variable 'ansible_module_compression' from source: unknown 13531 1726882423.02680: variable 'ansible_shell_type' from source: unknown 13531 1726882423.02685: variable 'ansible_shell_executable' from source: unknown 13531 1726882423.02691: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882423.02696: variable 'ansible_pipelining' from source: unknown 13531 1726882423.02702: variable 'ansible_timeout' from source: unknown 13531 1726882423.02708: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882423.02844: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882423.02863: variable 'omit' from source: magic vars 13531 1726882423.02875: starting attempt loop 13531 1726882423.02882: running the handler 13531 1726882423.02930: handler run complete 13531 1726882423.02958: attempt loop complete, returning result 13531 1726882423.02968: _execute() done 13531 1726882423.02974: dumping result to json 13531 1726882423.02981: done dumping result, returning 13531 1726882423.02992: done running TaskExecutor() for managed_node2/TASK: TEST Add Bond with 2 ports [0e448fcc-3ce9-4fd9-519d-00000000001e] 13531 1726882423.03002: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000001e ok: [managed_node2] => {} MSG: ################################################## 13531 1726882423.03150: no more pending results, returning what we have 13531 1726882423.03156: results queue empty 13531 1726882423.03157: checking for any_errors_fatal 13531 1726882423.03166: done checking for any_errors_fatal 13531 1726882423.03167: checking for max_fail_percentage 13531 1726882423.03169: done checking for max_fail_percentage 13531 1726882423.03170: checking to see if all hosts have failed and the running result is not ok 13531 1726882423.03171: done checking to see if all hosts have failed 13531 1726882423.03171: getting the remaining hosts for this loop 13531 1726882423.03173: done getting the remaining hosts for this loop 13531 1726882423.03176: getting the next task for host managed_node2 13531 1726882423.03183: done getting next task for host managed_node2 13531 1726882423.03189: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13531 1726882423.03192: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks 
child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882423.03209: getting variables 13531 1726882423.03211: in VariableManager get_vars() 13531 1726882423.03277: Calling all_inventory to load vars for managed_node2 13531 1726882423.03280: Calling groups_inventory to load vars for managed_node2 13531 1726882423.03283: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882423.03294: Calling all_plugins_play to load vars for managed_node2 13531 1726882423.03297: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882423.03300: Calling groups_plugins_play to load vars for managed_node2 13531 1726882423.03530: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882423.03744: done with get_vars() 13531 1726882423.03757: done getting variables 13531 1726882423.04059: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000001e 13531 1726882423.04062: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:33:43 -0400 (0:00:00.032) 0:00:10.936 ****** 13531 1726882423.04135: entering _queue_task() for managed_node2/include_tasks 13531 1726882423.04379: worker is 1 (out of 1 available) 13531 1726882423.04391: exiting _queue_task() for managed_node2/include_tasks 13531 1726882423.04403: done queuing things up, now waiting for results queue to drain 13531 1726882423.04404: waiting for pending results... 
13531 1726882423.04673: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13531 1726882423.04812: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000026 13531 1726882423.04834: variable 'ansible_search_path' from source: unknown 13531 1726882423.04846: variable 'ansible_search_path' from source: unknown 13531 1726882423.04894: calling self._execute() 13531 1726882423.04987: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882423.04999: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882423.05012: variable 'omit' from source: magic vars 13531 1726882423.05379: variable 'ansible_distribution_major_version' from source: facts 13531 1726882423.05400: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882423.05410: _execute() done 13531 1726882423.05418: dumping result to json 13531 1726882423.05424: done dumping result, returning 13531 1726882423.05435: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-4fd9-519d-000000000026] 13531 1726882423.05447: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000026 13531 1726882423.05588: no more pending results, returning what we have 13531 1726882423.05594: in VariableManager get_vars() 13531 1726882423.05661: Calling all_inventory to load vars for managed_node2 13531 1726882423.05666: Calling groups_inventory to load vars for managed_node2 13531 1726882423.05668: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882423.05682: Calling all_plugins_play to load vars for managed_node2 13531 1726882423.05685: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882423.05689: Calling groups_plugins_play to load vars for managed_node2 13531 1726882423.05855: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000026 13531 
1726882423.05858: WORKER PROCESS EXITING 13531 1726882423.05883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882423.06070: done with get_vars() 13531 1726882423.06078: variable 'ansible_search_path' from source: unknown 13531 1726882423.06080: variable 'ansible_search_path' from source: unknown 13531 1726882423.06118: we have included files to process 13531 1726882423.06119: generating all_blocks data 13531 1726882423.06121: done generating all_blocks data 13531 1726882423.06126: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13531 1726882423.06127: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13531 1726882423.06129: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13531 1726882423.06862: done processing included file 13531 1726882423.06866: iterating over new_blocks loaded from include file 13531 1726882423.06868: in VariableManager get_vars() 13531 1726882423.06897: done with get_vars() 13531 1726882423.06898: filtering new block on tags 13531 1726882423.06911: done filtering new block on tags 13531 1726882423.06914: in VariableManager get_vars() 13531 1726882423.06931: done with get_vars() 13531 1726882423.06933: filtering new block on tags 13531 1726882423.06949: done filtering new block on tags 13531 1726882423.06954: in VariableManager get_vars() 13531 1726882423.06983: done with get_vars() 13531 1726882423.06984: filtering new block on tags 13531 1726882423.06996: done filtering new block on tags 13531 1726882423.06998: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 13531 1726882423.07002: extending task lists for all hosts 
with included blocks 13531 1726882423.07500: done extending task lists 13531 1726882423.07501: done processing included files 13531 1726882423.07502: results queue empty 13531 1726882423.07502: checking for any_errors_fatal 13531 1726882423.07504: done checking for any_errors_fatal 13531 1726882423.07505: checking for max_fail_percentage 13531 1726882423.07506: done checking for max_fail_percentage 13531 1726882423.07506: checking to see if all hosts have failed and the running result is not ok 13531 1726882423.07507: done checking to see if all hosts have failed 13531 1726882423.07507: getting the remaining hosts for this loop 13531 1726882423.07508: done getting the remaining hosts for this loop 13531 1726882423.07509: getting the next task for host managed_node2 13531 1726882423.07512: done getting next task for host managed_node2 13531 1726882423.07514: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 13531 1726882423.07516: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882423.07522: getting variables 13531 1726882423.07523: in VariableManager get_vars() 13531 1726882423.07536: Calling all_inventory to load vars for managed_node2 13531 1726882423.07537: Calling groups_inventory to load vars for managed_node2 13531 1726882423.07538: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882423.07542: Calling all_plugins_play to load vars for managed_node2 13531 1726882423.07545: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882423.07546: Calling groups_plugins_play to load vars for managed_node2 13531 1726882423.07781: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882423.07896: done with get_vars() 13531 1726882423.07902: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:33:43 -0400 (0:00:00.038) 0:00:10.974 ****** 13531 1726882423.07950: entering _queue_task() for managed_node2/setup 13531 1726882423.08178: worker is 1 (out of 1 available) 13531 1726882423.08191: exiting _queue_task() for managed_node2/setup 13531 1726882423.08203: done queuing things up, now waiting for results queue to drain 13531 1726882423.08204: waiting for pending results... 
13531 1726882423.08378: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 13531 1726882423.08476: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000027e 13531 1726882423.08486: variable 'ansible_search_path' from source: unknown 13531 1726882423.08489: variable 'ansible_search_path' from source: unknown 13531 1726882423.08521: calling self._execute() 13531 1726882423.08588: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882423.08591: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882423.08599: variable 'omit' from source: magic vars 13531 1726882423.08993: variable 'ansible_distribution_major_version' from source: facts 13531 1726882423.09011: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882423.09256: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13531 1726882423.11032: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13531 1726882423.11080: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13531 1726882423.11107: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13531 1726882423.11132: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13531 1726882423.11156: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13531 1726882423.11214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882423.11233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882423.11254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882423.11284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882423.11295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882423.11331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882423.11347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882423.11367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882423.11395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882423.11407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882423.11517: variable '__network_required_facts' from source: role 
'' defaults 13531 1726882423.11524: variable 'ansible_facts' from source: unknown 13531 1726882423.11582: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 13531 1726882423.11588: when evaluation is False, skipping this task 13531 1726882423.11590: _execute() done 13531 1726882423.11593: dumping result to json 13531 1726882423.11594: done dumping result, returning 13531 1726882423.11600: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-4fd9-519d-00000000027e] 13531 1726882423.11611: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000027e 13531 1726882423.11716: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000027e 13531 1726882423.11719: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13531 1726882423.11802: no more pending results, returning what we have 13531 1726882423.11805: results queue empty 13531 1726882423.11806: checking for any_errors_fatal 13531 1726882423.11807: done checking for any_errors_fatal 13531 1726882423.11808: checking for max_fail_percentage 13531 1726882423.11810: done checking for max_fail_percentage 13531 1726882423.11810: checking to see if all hosts have failed and the running result is not ok 13531 1726882423.11811: done checking to see if all hosts have failed 13531 1726882423.11812: getting the remaining hosts for this loop 13531 1726882423.11813: done getting the remaining hosts for this loop 13531 1726882423.11817: getting the next task for host managed_node2 13531 1726882423.11826: done getting next task for host managed_node2 13531 1726882423.11831: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 13531 1726882423.11835: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882423.11848: getting variables 13531 1726882423.11850: in VariableManager get_vars() 13531 1726882423.11906: Calling all_inventory to load vars for managed_node2 13531 1726882423.11909: Calling groups_inventory to load vars for managed_node2 13531 1726882423.11911: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882423.11920: Calling all_plugins_play to load vars for managed_node2 13531 1726882423.11922: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882423.11925: Calling groups_plugins_play to load vars for managed_node2 13531 1726882423.12115: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882423.12362: done with get_vars() 13531 1726882423.12376: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:33:43 -0400 (0:00:00.045) 0:00:11.020 ****** 13531 1726882423.12489: entering _queue_task() for managed_node2/stat 13531 1726882423.12805: worker is 1 (out of 1 
available) 13531 1726882423.12817: exiting _queue_task() for managed_node2/stat 13531 1726882423.12830: done queuing things up, now waiting for results queue to drain 13531 1726882423.12831: waiting for pending results... 13531 1726882423.13136: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 13531 1726882423.13311: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000280 13531 1726882423.13331: variable 'ansible_search_path' from source: unknown 13531 1726882423.13338: variable 'ansible_search_path' from source: unknown 13531 1726882423.13397: calling self._execute() 13531 1726882423.13505: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882423.13516: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882423.13529: variable 'omit' from source: magic vars 13531 1726882423.14011: variable 'ansible_distribution_major_version' from source: facts 13531 1726882423.14027: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882423.14213: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13531 1726882423.14515: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13531 1726882423.14568: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13531 1726882423.14618: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13531 1726882423.14658: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13531 1726882423.14758: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13531 1726882423.14791: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13531 1726882423.14833: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882423.14871: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13531 1726882423.14983: variable '__network_is_ostree' from source: set_fact 13531 1726882423.14995: Evaluated conditional (not __network_is_ostree is defined): False 13531 1726882423.15002: when evaluation is False, skipping this task 13531 1726882423.15008: _execute() done 13531 1726882423.15013: dumping result to json 13531 1726882423.15026: done dumping result, returning 13531 1726882423.15040: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-4fd9-519d-000000000280] 13531 1726882423.15051: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000280 skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 13531 1726882423.15207: no more pending results, returning what we have 13531 1726882423.15211: results queue empty 13531 1726882423.15212: checking for any_errors_fatal 13531 1726882423.15218: done checking for any_errors_fatal 13531 1726882423.15219: checking for max_fail_percentage 13531 1726882423.15221: done checking for max_fail_percentage 13531 1726882423.15222: checking to see if all hosts have failed and the running result is not ok 13531 1726882423.15223: done checking to see if all hosts have failed 13531 1726882423.15223: getting the remaining hosts for this loop 13531 
1726882423.15225: done getting the remaining hosts for this loop 13531 1726882423.15228: getting the next task for host managed_node2 13531 1726882423.15236: done getting next task for host managed_node2 13531 1726882423.15241: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 13531 1726882423.15245: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882423.15262: getting variables 13531 1726882423.15264: in VariableManager get_vars() 13531 1726882423.15323: Calling all_inventory to load vars for managed_node2 13531 1726882423.15326: Calling groups_inventory to load vars for managed_node2 13531 1726882423.15329: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882423.15340: Calling all_plugins_play to load vars for managed_node2 13531 1726882423.15343: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882423.15346: Calling groups_plugins_play to load vars for managed_node2 13531 1726882423.15600: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882423.15955: done with get_vars() 13531 1726882423.15970: done getting variables 13531 1726882423.16018: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000280 13531 1726882423.16021: WORKER PROCESS EXITING 13531 1726882423.16061: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:33:43 -0400 (0:00:00.036) 0:00:11.056 ****** 13531 1726882423.16106: entering _queue_task() for managed_node2/set_fact 13531 1726882423.16579: worker is 1 (out of 1 available) 13531 1726882423.16591: exiting _queue_task() for managed_node2/set_fact 13531 1726882423.16603: done queuing things up, now waiting for results queue to drain 13531 1726882423.16605: waiting for pending results... 
13531 1726882423.16893: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 13531 1726882423.17076: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000281 13531 1726882423.17097: variable 'ansible_search_path' from source: unknown 13531 1726882423.17106: variable 'ansible_search_path' from source: unknown 13531 1726882423.17161: calling self._execute() 13531 1726882423.17261: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882423.17277: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882423.17292: variable 'omit' from source: magic vars 13531 1726882423.17719: variable 'ansible_distribution_major_version' from source: facts 13531 1726882423.17737: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882423.17936: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13531 1726882423.18243: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13531 1726882423.18298: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13531 1726882423.18346: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13531 1726882423.18430: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13531 1726882423.18528: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13531 1726882423.18569: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13531 1726882423.18606: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882423.18639: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13531 1726882423.18742: variable '__network_is_ostree' from source: set_fact 13531 1726882423.18766: Evaluated conditional (not __network_is_ostree is defined): False 13531 1726882423.18774: when evaluation is False, skipping this task 13531 1726882423.18781: _execute() done 13531 1726882423.18786: dumping result to json 13531 1726882423.18796: done dumping result, returning 13531 1726882423.18808: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-4fd9-519d-000000000281] 13531 1726882423.18819: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000281 skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 13531 1726882423.18971: no more pending results, returning what we have 13531 1726882423.18975: results queue empty 13531 1726882423.18976: checking for any_errors_fatal 13531 1726882423.18981: done checking for any_errors_fatal 13531 1726882423.18982: checking for max_fail_percentage 13531 1726882423.18984: done checking for max_fail_percentage 13531 1726882423.18985: checking to see if all hosts have failed and the running result is not ok 13531 1726882423.18986: done checking to see if all hosts have failed 13531 1726882423.18987: getting the remaining hosts for this loop 13531 1726882423.18988: done getting the remaining hosts for this loop 13531 1726882423.18992: getting the next task for host managed_node2 13531 1726882423.19002: done getting next task for host managed_node2 13531 
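The skipped result above is Ansible's `when:` machinery at work: an earlier task in `set_facts.yml` already set `__network_is_ostree`, so the guard `not __network_is_ostree is defined` renders False and the `set_fact` module never runs. A minimal sketch of that short-circuit, using a plain dict for the host's variable namespace (Ansible really renders the conditional through Jinja2; `evaluate_when` and `run_or_skip` here are hypothetical names, not Ansible's internal API):

```python
# Sketch of conditional task skipping. Jinja2's `is defined` test is
# modeled as simple dict membership against the host's variables.
def evaluate_when(var_name, negate, hostvars):
    defined = var_name in hostvars
    return (not defined) if negate else defined

def run_or_skip(var_name, hostvars):
    if not evaluate_when(var_name, negate=True, hostvars=hostvars):
        # Mirrors the "skipping: [managed_node2]" result in the log.
        return {"changed": False,
                "false_condition": f"not {var_name} is defined",
                "skip_reason": "Conditional result was False"}
    # Otherwise the set_fact action would run and return ansible_facts.
    return {"changed": False, "ansible_facts": {var_name: False}}

# __network_is_ostree was already set by a previous task, so: skip.
result = run_or_skip("__network_is_ostree", {"__network_is_ostree": False})
```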
1726882423.19006: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 13531 1726882423.19010: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882423.19023: getting variables 13531 1726882423.19025: in VariableManager get_vars() 13531 1726882423.19086: Calling all_inventory to load vars for managed_node2 13531 1726882423.19088: Calling groups_inventory to load vars for managed_node2 13531 1726882423.19090: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882423.19101: Calling all_plugins_play to load vars for managed_node2 13531 1726882423.19104: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882423.19106: Calling groups_plugins_play to load vars for managed_node2 13531 1726882423.19311: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882423.19544: done with get_vars() 13531 1726882423.19559: done getting variables 13531 1726882423.19719: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000281 13531 1726882423.19723: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Check which 
services are running] **** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:33:43 -0400 (0:00:00.036) 0:00:11.093 ****** 13531 1726882423.19806: entering _queue_task() for managed_node2/service_facts 13531 1726882423.19808: Creating lock for service_facts 13531 1726882423.20277: worker is 1 (out of 1 available) 13531 1726882423.20290: exiting _queue_task() for managed_node2/service_facts 13531 1726882423.20303: done queuing things up, now waiting for results queue to drain 13531 1726882423.20304: waiting for pending results... 13531 1726882423.20594: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 13531 1726882423.20761: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000283 13531 1726882423.20784: variable 'ansible_search_path' from source: unknown 13531 1726882423.20792: variable 'ansible_search_path' from source: unknown 13531 1726882423.20841: calling self._execute() 13531 1726882423.20938: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882423.20949: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882423.20968: variable 'omit' from source: magic vars 13531 1726882423.21450: variable 'ansible_distribution_major_version' from source: facts 13531 1726882423.21473: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882423.21489: variable 'omit' from source: magic vars 13531 1726882423.21571: variable 'omit' from source: magic vars 13531 1726882423.21619: variable 'omit' from source: magic vars 13531 1726882423.21670: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882423.21716: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882423.21742: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882423.21768: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882423.21784: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882423.21826: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882423.21838: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882423.21846: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882423.21975: Set connection var ansible_pipelining to False 13531 1726882423.21987: Set connection var ansible_timeout to 10 13531 1726882423.21999: Set connection var ansible_shell_executable to /bin/sh 13531 1726882423.22009: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882423.22016: Set connection var ansible_connection to ssh 13531 1726882423.22031: Set connection var ansible_shell_type to sh 13531 1726882423.22073: variable 'ansible_shell_executable' from source: unknown 13531 1726882423.22081: variable 'ansible_connection' from source: unknown 13531 1726882423.22088: variable 'ansible_module_compression' from source: unknown 13531 1726882423.22094: variable 'ansible_shell_type' from source: unknown 13531 1726882423.22099: variable 'ansible_shell_executable' from source: unknown 13531 1726882423.22105: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882423.22112: variable 'ansible_pipelining' from source: unknown 13531 1726882423.22119: variable 'ansible_timeout' from source: unknown 13531 1726882423.22129: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882423.22344: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13531 1726882423.22372: variable 'omit' from source: magic vars 13531 1726882423.22385: starting attempt loop 13531 1726882423.22393: running the handler 13531 1726882423.22410: _low_level_execute_command(): starting 13531 1726882423.22421: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13531 1726882423.23243: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882423.23263: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882423.23280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882423.23298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882423.23342: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882423.23367: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882423.23383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882423.23403: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882423.23417: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882423.23429: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882423.23442: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882423.23469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882423.23488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882423.23503: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.11.158 originally 10.31.11.158 <<< 13531 1726882423.23516: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882423.23531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882423.23622: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882423.23647: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882423.23672: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882423.23824: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882423.25509: stdout chunk (state=3): >>>/root <<< 13531 1726882423.25656: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882423.25659: stderr chunk (state=3): >>><<< 13531 1726882423.25662: stdout chunk (state=3): >>><<< 13531 1726882423.25685: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK 
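The `_low_level_execute_command()` exchange above is the first thing Ansible runs over the wire for this task: a throwaway `echo ~` to learn the remote user's home directory (here `/root`) before building tmp paths under it. The same command shape can be reproduced locally (run on the controller rather than over the multiplexed SSH ControlMaster connection the log shows):

```python
import subprocess

# Same command string as in the log; `sleep 0` just guarantees a clean
# exit-status flush after the output is written.
probe = subprocess.run(
    ["/bin/sh", "-c", "echo ~ && sleep 0"],
    capture_output=True, text=True, check=True,
)
home = probe.stdout.strip()  # the log's rc=0, stdout=/root
```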
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882423.25696: _low_level_execute_command(): starting 13531 1726882423.25703: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882423.2568507-14101-240096637354878 `" && echo ansible-tmp-1726882423.2568507-14101-240096637354878="` echo /root/.ansible/tmp/ansible-tmp-1726882423.2568507-14101-240096637354878 `" ) && sleep 0' 13531 1726882423.26140: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882423.26144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882423.26179: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882423.26186: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882423.26215: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 13531 1726882423.26218: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882423.26221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 13531 1726882423.26223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882423.26276: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882423.26280: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882423.26386: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882423.28326: stdout chunk (state=3): >>>ansible-tmp-1726882423.2568507-14101-240096637354878=/root/.ansible/tmp/ansible-tmp-1726882423.2568507-14101-240096637354878 <<< 13531 1726882423.28432: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882423.28485: stderr chunk (state=3): >>><<< 13531 1726882423.28488: stdout chunk (state=3): >>><<< 13531 1726882423.28501: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882423.2568507-14101-240096637354878=/root/.ansible/tmp/ansible-tmp-1726882423.2568507-14101-240096637354878 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
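The mkdir step that just completed creates a unique, owner-only temporary directory per task and echoes its name back so the controller learns the exact path (`/root/.ansible/tmp/ansible-tmp-1726882423.2568507-14101-240096637354878` in the log). A local re-creation of the command shape, substituting a sandbox root for `/root/.ansible/tmp`:

```python
import os
import subprocess
import tempfile

base = tempfile.mkdtemp()  # stand-in for /root/.ansible/tmp
tmpdir = os.path.join(base, "ansible-tmp-example")

# umask 77 makes the directory private (mode 700), the outer mkdir -p
# creates the parent, and the inner mkdir (no -p) fails loudly if the
# supposedly unique per-task directory somehow already exists.
out = subprocess.run(
    ["/bin/sh", "-c",
     f'( umask 77 && mkdir -p "{base}" && mkdir "{tmpdir}" '
     f'&& echo "{tmpdir}" ) && sleep 0'],
    capture_output=True, text=True, check=True,
)
mode = os.stat(tmpdir).st_mode & 0o777  # 0o700: private to the owner
```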
Received exit status from master 0 13531 1726882423.28539: variable 'ansible_module_compression' from source: unknown 13531 1726882423.28577: ANSIBALLZ: Using lock for service_facts 13531 1726882423.28580: ANSIBALLZ: Acquiring lock 13531 1726882423.28582: ANSIBALLZ: Lock acquired: 139969310554976 13531 1726882423.28585: ANSIBALLZ: Creating module 13531 1726882423.39193: ANSIBALLZ: Writing module into payload 13531 1726882423.39275: ANSIBALLZ: Writing module 13531 1726882423.39295: ANSIBALLZ: Renaming module 13531 1726882423.39300: ANSIBALLZ: Done creating module 13531 1726882423.39314: variable 'ansible_facts' from source: unknown 13531 1726882423.39367: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882423.2568507-14101-240096637354878/AnsiballZ_service_facts.py 13531 1726882423.39474: Sending initial data 13531 1726882423.39479: Sent initial data (162 bytes) 13531 1726882423.40224: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882423.40227: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882423.40229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882423.40232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882423.40265: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882423.40273: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882423.40283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882423.40297: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882423.40304: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882423.40311: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 
1726882423.40325: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882423.40328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882423.40338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882423.40344: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882423.40350: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882423.40364: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882423.40440: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882423.40455: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882423.40475: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882423.40615: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882423.42448: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13531 1726882423.42542: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13531 1726882423.42641: stdout chunk (state=3): >>>sftp> put 
/root/.ansible/tmp/ansible-local-13531i_snf1g8/tmpcoutj53l /root/.ansible/tmp/ansible-tmp-1726882423.2568507-14101-240096637354878/AnsiballZ_service_facts.py <<< 13531 1726882423.42744: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13531 1726882423.43787: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882423.43899: stderr chunk (state=3): >>><<< 13531 1726882423.43903: stdout chunk (state=3): >>><<< 13531 1726882423.43917: done transferring module to remote 13531 1726882423.43927: _low_level_execute_command(): starting 13531 1726882423.43931: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882423.2568507-14101-240096637354878/ /root/.ansible/tmp/ansible-tmp-1726882423.2568507-14101-240096637354878/AnsiballZ_service_facts.py && sleep 0' 13531 1726882423.44390: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882423.44398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882423.44433: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882423.44438: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 13531 1726882423.44446: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882423.44451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882423.44475: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 13531 1726882423.44478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882423.44525: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882423.44529: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882423.44542: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882423.44651: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882423.46385: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882423.46432: stderr chunk (state=3): >>><<< 13531 1726882423.46435: stdout chunk (state=3): >>><<< 13531 1726882423.46456: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
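With `AnsiballZ_service_facts.py` transferred via sftp, the chmod above marks both the per-task directory and the payload executable for the owner before the remote python interpreter is invoked. Equivalent local steps (hypothetical sandbox path standing in for the log's `/root/.ansible/tmp/ansible-tmp-…` directory, and a placeholder file standing in for the zipped module payload):

```python
import os
import stat
import subprocess
import tempfile

workdir = tempfile.mkdtemp()
payload = os.path.join(workdir, "AnsiballZ_service_facts.py")
with open(payload, "w") as f:
    f.write("# stand-in for the AnsiballZ-wrapped service_facts module\n")

# Same command shape as the log: one chmod covering the directory and
# the module file, followed by the usual `sleep 0` status flush.
subprocess.run(
    ["/bin/sh", "-c", f"chmod u+x '{workdir}/' '{payload}' && sleep 0"],
    check=True,
)
```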
master session id: 2 debug2: Received exit status from master 0 13531 1726882423.46459: _low_level_execute_command(): starting 13531 1726882423.46462: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882423.2568507-14101-240096637354878/AnsiballZ_service_facts.py && sleep 0' 13531 1726882423.46899: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882423.46904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882423.46935: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882423.46947: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882423.47003: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882423.47015: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882423.47129: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882424.82716: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": 
"dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": 
{"name": "plymouth-start.service", <<< 13531 1726882424.82751: stdout chunk (state=3): >>>"state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", 
"source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, 
"systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": <<< 13531 1726882424.82759: stdout chunk (state=3): >>>{"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", 
"source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", <<< 13531 1726882424.82761: stdout chunk (state=3): >>>"status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": 
{"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, <<< 13531 1726882424.82786: stdout chunk (state=3): >>>"systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 13531 1726882424.84012: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 13531 1726882424.84074: stderr chunk (state=3): >>><<< 13531 1726882424.84078: stdout chunk (state=3): >>><<< 13531 1726882424.84101: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", 
"status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", 
"status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": 
{"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": 
"systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": 
"cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": 
"fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": 
"quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": 
"user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
13531 1726882424.84444: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882423.2568507-14101-240096637354878/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13531 1726882424.84454: _low_level_execute_command(): starting 13531 1726882424.84457: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882423.2568507-14101-240096637354878/ > /dev/null 2>&1 && sleep 0' 13531 1726882424.84925: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882424.84937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882424.84957: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 13531 1726882424.84972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882424.84983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882424.85028: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882424.85039: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882424.85152: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882424.86955: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882424.87008: stderr chunk (state=3): >>><<< 13531 1726882424.87011: stdout chunk (state=3): >>><<< 13531 1726882424.87023: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882424.87029: handler run 
complete 13531 1726882424.87158: variable 'ansible_facts' from source: unknown 13531 1726882424.87293: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882424.87736: variable 'ansible_facts' from source: unknown 13531 1726882424.87870: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882424.88066: attempt loop complete, returning result 13531 1726882424.88078: _execute() done 13531 1726882424.88085: dumping result to json 13531 1726882424.88143: done dumping result, returning 13531 1726882424.88161: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-4fd9-519d-000000000283] 13531 1726882424.88176: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000283 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13531 1726882424.88888: no more pending results, returning what we have 13531 1726882424.88890: results queue empty 13531 1726882424.88891: checking for any_errors_fatal 13531 1726882424.88896: done checking for any_errors_fatal 13531 1726882424.88896: checking for max_fail_percentage 13531 1726882424.88898: done checking for max_fail_percentage 13531 1726882424.88898: checking to see if all hosts have failed and the running result is not ok 13531 1726882424.88899: done checking to see if all hosts have failed 13531 1726882424.88900: getting the remaining hosts for this loop 13531 1726882424.88901: done getting the remaining hosts for this loop 13531 1726882424.88904: getting the next task for host managed_node2 13531 1726882424.88908: done getting next task for host managed_node2 13531 1726882424.88911: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 13531 1726882424.88915: ^ state is: HOST STATE: 
block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882424.88922: getting variables 13531 1726882424.88924: in VariableManager get_vars() 13531 1726882424.88962: Calling all_inventory to load vars for managed_node2 13531 1726882424.88966: Calling groups_inventory to load vars for managed_node2 13531 1726882424.88968: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882424.88976: Calling all_plugins_play to load vars for managed_node2 13531 1726882424.88978: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882424.88981: Calling groups_plugins_play to load vars for managed_node2 13531 1726882424.89201: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000283 13531 1726882424.89204: WORKER PROCESS EXITING 13531 1726882424.89215: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882424.89492: done with get_vars() 13531 1726882424.89501: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 
September 2024 21:33:44 -0400 (0:00:01.697) 0:00:12.790 ****** 13531 1726882424.89572: entering _queue_task() for managed_node2/package_facts 13531 1726882424.89576: Creating lock for package_facts 13531 1726882424.89779: worker is 1 (out of 1 available) 13531 1726882424.89791: exiting _queue_task() for managed_node2/package_facts 13531 1726882424.89803: done queuing things up, now waiting for results queue to drain 13531 1726882424.89805: waiting for pending results... 13531 1726882424.89975: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 13531 1726882424.90071: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000284 13531 1726882424.90082: variable 'ansible_search_path' from source: unknown 13531 1726882424.90087: variable 'ansible_search_path' from source: unknown 13531 1726882424.90119: calling self._execute() 13531 1726882424.90185: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882424.90189: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882424.90198: variable 'omit' from source: magic vars 13531 1726882424.90545: variable 'ansible_distribution_major_version' from source: facts 13531 1726882424.90567: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882424.90578: variable 'omit' from source: magic vars 13531 1726882424.90649: variable 'omit' from source: magic vars 13531 1726882424.90687: variable 'omit' from source: magic vars 13531 1726882424.90728: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882424.90772: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882424.90794: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882424.90814: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882424.90827: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882424.90868: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882424.90876: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882424.90885: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882424.90998: Set connection var ansible_pipelining to False 13531 1726882424.91009: Set connection var ansible_timeout to 10 13531 1726882424.91018: Set connection var ansible_shell_executable to /bin/sh 13531 1726882424.91027: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882424.91033: Set connection var ansible_connection to ssh 13531 1726882424.91044: Set connection var ansible_shell_type to sh 13531 1726882424.91086: variable 'ansible_shell_executable' from source: unknown 13531 1726882424.91095: variable 'ansible_connection' from source: unknown 13531 1726882424.91102: variable 'ansible_module_compression' from source: unknown 13531 1726882424.91108: variable 'ansible_shell_type' from source: unknown 13531 1726882424.91114: variable 'ansible_shell_executable' from source: unknown 13531 1726882424.91120: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882424.91126: variable 'ansible_pipelining' from source: unknown 13531 1726882424.91133: variable 'ansible_timeout' from source: unknown 13531 1726882424.91140: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882424.91344: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13531 1726882424.91365: variable 'omit' from source: magic vars 13531 1726882424.91375: starting attempt loop 13531 1726882424.91381: running the handler 13531 1726882424.91404: _low_level_execute_command(): starting 13531 1726882424.91417: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13531 1726882424.92201: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882424.92215: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882424.92230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882424.92247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882424.92304: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882424.92315: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882424.92328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882424.92346: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882424.92360: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882424.92373: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882424.92385: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882424.92403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882424.92419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882424.92431: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.11.158 originally 10.31.11.158 <<< 13531 1726882424.92446: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882424.92465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882424.92547: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882424.92576: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882424.92592: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882424.92729: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882424.94397: stdout chunk (state=3): >>>/root <<< 13531 1726882424.94592: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882424.94595: stdout chunk (state=3): >>><<< 13531 1726882424.94597: stderr chunk (state=3): >>><<< 13531 1726882424.94722: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882424.94727: _low_level_execute_command(): starting 13531 1726882424.94730: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882424.9462075-14151-73860293800356 `" && echo ansible-tmp-1726882424.9462075-14151-73860293800356="` echo /root/.ansible/tmp/ansible-tmp-1726882424.9462075-14151-73860293800356 `" ) && sleep 0' 13531 1726882424.95349: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882424.95372: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882424.95389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882424.95407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882424.95451: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882424.95476: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882424.95493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882424.95511: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882424.95524: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882424.95536: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882424.95549: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882424.95568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882424.95587: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882424.95602: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882424.95613: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882424.95627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882424.95708: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882424.95725: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882424.95739: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882424.95901: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882424.97792: stdout chunk (state=3): >>>ansible-tmp-1726882424.9462075-14151-73860293800356=/root/.ansible/tmp/ansible-tmp-1726882424.9462075-14151-73860293800356 <<< 13531 1726882424.97898: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882424.97961: stderr chunk (state=3): >>><<< 13531 1726882424.97965: stdout chunk (state=3): >>><<< 13531 1726882424.97980: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882424.9462075-14151-73860293800356=/root/.ansible/tmp/ansible-tmp-1726882424.9462075-14151-73860293800356 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882424.98017: variable 'ansible_module_compression' from source: unknown 13531 1726882424.98058: ANSIBALLZ: Using lock for package_facts 13531 1726882424.98061: ANSIBALLZ: Acquiring lock 13531 1726882424.98066: ANSIBALLZ: Lock acquired: 139969313625856 13531 1726882424.98068: ANSIBALLZ: Creating module 13531 1726882425.20146: ANSIBALLZ: Writing module into payload 13531 1726882425.20261: ANSIBALLZ: Writing module 13531 1726882425.20294: ANSIBALLZ: Renaming module 13531 1726882425.20299: ANSIBALLZ: Done creating module 13531 1726882425.20326: variable 'ansible_facts' from source: unknown 13531 1726882425.20461: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882424.9462075-14151-73860293800356/AnsiballZ_package_facts.py 13531 1726882425.20577: Sending initial data 13531 1726882425.20586: Sent initial data (161 bytes) 13531 1726882425.21279: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882425.21283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882425.21320: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 13531 1726882425.21324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882425.21326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 13531 1726882425.21328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882425.21386: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882425.21389: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882425.21392: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882425.21505: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882425.23337: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13531 1726882425.23432: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13531 1726882425.23531: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13531i_snf1g8/tmpvr4x9k43 
/root/.ansible/tmp/ansible-tmp-1726882424.9462075-14151-73860293800356/AnsiballZ_package_facts.py <<< 13531 1726882425.23626: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13531 1726882425.25606: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882425.25717: stderr chunk (state=3): >>><<< 13531 1726882425.25721: stdout chunk (state=3): >>><<< 13531 1726882425.25736: done transferring module to remote 13531 1726882425.25746: _low_level_execute_command(): starting 13531 1726882425.25754: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882424.9462075-14151-73860293800356/ /root/.ansible/tmp/ansible-tmp-1726882424.9462075-14151-73860293800356/AnsiballZ_package_facts.py && sleep 0' 13531 1726882425.26221: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882425.26224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882425.26266: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882425.26270: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882425.26272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 13531 1726882425.26274: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882425.26325: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882425.26328: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882425.26433: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882425.28201: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882425.28246: stderr chunk (state=3): >>><<< 13531 1726882425.28249: stdout chunk (state=3): >>><<< 13531 1726882425.28267: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882425.28270: _low_level_execute_command(): starting 13531 1726882425.28275: _low_level_execute_command(): executing: /bin/sh -c 
'/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882424.9462075-14151-73860293800356/AnsiballZ_package_facts.py && sleep 0' 13531 1726882425.28724: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882425.28727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882425.28759: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882425.28762: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882425.28767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882425.28825: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882425.28831: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882425.28834: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882425.28936: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882425.75309: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": 
"20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": 
"0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": 
"122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", 
"version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": 
"38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": 
"2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "sour<<< 13531 1726882425.75376: stdout chunk (state=3): >>>ce": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": 
[{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", 
"version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": 
"0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": 
"elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", 
"release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", 
"release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": 
"146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", 
"source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", 
"version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": 
"460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": 
[{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": 
"kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": 
[{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": 
"481.el9", "epoch":<<< 13531 1726882425.75447: stdout chunk (state=3): >>> 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", 
"epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyseri<<< 13531 1726882425.75450: stdout chunk (state=3): >>>al", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": 
"4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 13531 1726882425.76895: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 13531 1726882425.76948: stderr chunk (state=3): >>><<< 13531 1726882425.76952: stdout chunk (state=3): >>><<< 13531 1726882425.76986: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": 
[{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": 
[{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", 
"release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": 
"0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": 
"libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": 
"cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", 
"version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", 
"release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", 
"version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": 
[{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", 
"release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": 
"elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": 
"2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", 
"release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": 
"liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": 
"146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": 
[{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": 
"perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", 
"version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, 
"arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": 
"python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": 
[{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
13531 1726882425.79537: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882424.9462075-14151-73860293800356/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13531 1726882425.79570: _low_level_execute_command(): starting 13531 1726882425.79581: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882424.9462075-14151-73860293800356/ > /dev/null 2>&1 && sleep 0' 13531 1726882425.80217: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882425.80232: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882425.80248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882425.80270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882425.80317: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882425.80331: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882425.80346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882425.80369: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882425.80383: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is 
address <<< 13531 1726882425.80394: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882425.80405: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882425.80416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882425.80431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882425.80443: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882425.80453: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882425.80470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882425.80544: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882425.80571: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882425.80588: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882425.80723: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882425.82566: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882425.82614: stderr chunk (state=3): >>><<< 13531 1726882425.82618: stdout chunk (state=3): >>><<< 13531 1726882425.82631: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882425.82636: handler run complete 13531 1726882425.83424: variable 'ansible_facts' from source: unknown 13531 1726882425.83918: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882425.85129: variable 'ansible_facts' from source: unknown 13531 1726882425.85397: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882425.85851: attempt loop complete, returning result 13531 1726882425.85867: _execute() done 13531 1726882425.85870: dumping result to json 13531 1726882425.86059: done dumping result, returning 13531 1726882425.86077: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-4fd9-519d-000000000284] 13531 1726882425.86088: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000284 13531 1726882425.91244: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000284 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13531 1726882425.91300: no more pending results, returning what we have 13531 1726882425.91303: results queue empty 13531 1726882425.91303: checking for any_errors_fatal 13531 1726882425.91308: done 
checking for any_errors_fatal 13531 1726882425.91308: checking for max_fail_percentage 13531 1726882425.91310: done checking for max_fail_percentage 13531 1726882425.91310: checking to see if all hosts have failed and the running result is not ok 13531 1726882425.91311: done checking to see if all hosts have failed 13531 1726882425.91312: getting the remaining hosts for this loop 13531 1726882425.91313: done getting the remaining hosts for this loop 13531 1726882425.91316: getting the next task for host managed_node2 13531 1726882425.91321: done getting next task for host managed_node2 13531 1726882425.91325: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 13531 1726882425.91328: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882425.91336: getting variables 13531 1726882425.91337: in VariableManager get_vars() 13531 1726882425.91379: Calling all_inventory to load vars for managed_node2 13531 1726882425.91382: Calling groups_inventory to load vars for managed_node2 13531 1726882425.91384: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882425.91393: Calling all_plugins_play to load vars for managed_node2 13531 1726882425.91395: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882425.91398: Calling groups_plugins_play to load vars for managed_node2 13531 1726882425.92546: WORKER PROCESS EXITING 13531 1726882425.92554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882425.93504: done with get_vars() 13531 1726882425.93526: done getting variables 13531 1726882425.93573: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:33:45 -0400 (0:00:01.040) 0:00:13.831 ****** 13531 1726882425.93595: entering _queue_task() for managed_node2/debug 13531 1726882425.93807: worker is 1 (out of 1 available) 13531 1726882425.93819: exiting _queue_task() for managed_node2/debug 13531 1726882425.93831: done queuing things up, now waiting for results queue to drain 13531 1726882425.93832: waiting for pending results... 
13531 1726882425.94000: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 13531 1726882425.94082: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000027 13531 1726882425.94092: variable 'ansible_search_path' from source: unknown 13531 1726882425.94096: variable 'ansible_search_path' from source: unknown 13531 1726882425.94128: calling self._execute() 13531 1726882425.94228: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882425.94239: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882425.94250: variable 'omit' from source: magic vars 13531 1726882425.94669: variable 'ansible_distribution_major_version' from source: facts 13531 1726882425.94688: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882425.94698: variable 'omit' from source: magic vars 13531 1726882425.94766: variable 'omit' from source: magic vars 13531 1726882425.94881: variable 'network_provider' from source: set_fact 13531 1726882425.94903: variable 'omit' from source: magic vars 13531 1726882425.94966: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882425.95002: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882425.95027: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882425.95068: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882425.95078: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882425.95100: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882425.95103: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 
1726882425.95106: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882425.95208: Set connection var ansible_pipelining to False 13531 1726882425.95212: Set connection var ansible_timeout to 10 13531 1726882425.95217: Set connection var ansible_shell_executable to /bin/sh 13531 1726882425.95222: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882425.95225: Set connection var ansible_connection to ssh 13531 1726882425.95227: Set connection var ansible_shell_type to sh 13531 1726882425.95246: variable 'ansible_shell_executable' from source: unknown 13531 1726882425.95249: variable 'ansible_connection' from source: unknown 13531 1726882425.95252: variable 'ansible_module_compression' from source: unknown 13531 1726882425.95254: variable 'ansible_shell_type' from source: unknown 13531 1726882425.95260: variable 'ansible_shell_executable' from source: unknown 13531 1726882425.95264: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882425.95267: variable 'ansible_pipelining' from source: unknown 13531 1726882425.95274: variable 'ansible_timeout' from source: unknown 13531 1726882425.95276: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882425.95383: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882425.95392: variable 'omit' from source: magic vars 13531 1726882425.95397: starting attempt loop 13531 1726882425.95399: running the handler 13531 1726882425.95433: handler run complete 13531 1726882425.95444: attempt loop complete, returning result 13531 1726882425.95446: _execute() done 13531 1726882425.95448: dumping result to json 13531 1726882425.95451: done dumping result, returning 
13531 1726882425.95461: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-4fd9-519d-000000000027] 13531 1726882425.95468: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000027 13531 1726882425.95545: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000027 13531 1726882425.95547: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 13531 1726882425.95604: no more pending results, returning what we have 13531 1726882425.95607: results queue empty 13531 1726882425.95608: checking for any_errors_fatal 13531 1726882425.95615: done checking for any_errors_fatal 13531 1726882425.95616: checking for max_fail_percentage 13531 1726882425.95618: done checking for max_fail_percentage 13531 1726882425.95619: checking to see if all hosts have failed and the running result is not ok 13531 1726882425.95619: done checking to see if all hosts have failed 13531 1726882425.95620: getting the remaining hosts for this loop 13531 1726882425.95621: done getting the remaining hosts for this loop 13531 1726882425.95624: getting the next task for host managed_node2 13531 1726882425.95630: done getting next task for host managed_node2 13531 1726882425.95633: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13531 1726882425.95638: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 13531 1726882425.95647: getting variables 13531 1726882425.95649: in VariableManager get_vars() 13531 1726882425.95696: Calling all_inventory to load vars for managed_node2 13531 1726882425.95699: Calling groups_inventory to load vars for managed_node2 13531 1726882425.95701: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882425.95709: Calling all_plugins_play to load vars for managed_node2 13531 1726882425.95711: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882425.95714: Calling groups_plugins_play to load vars for managed_node2 13531 1726882425.96566: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882425.97496: done with get_vars() 13531 1726882425.97512: done getting variables 13531 1726882425.97578: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:33:45 -0400 (0:00:00.040) 0:00:13.871 ****** 13531 1726882425.97601: entering _queue_task() for managed_node2/fail 13531 1726882425.97602: Creating lock for fail 13531 1726882425.97812: worker is 1 (out of 1 available) 13531 1726882425.97824: exiting _queue_task() for managed_node2/fail 13531 1726882425.97836: done queuing things up, now waiting for results queue to drain 13531 1726882425.97838: waiting for pending results... 
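For context, the "Print network provider" task whose `ok` result appears above is a `debug` action (the log shows the ActionModule `debug` being loaded for it, and the `network_provider` variable coming from `set_fact`). A minimal hedged sketch of such a task — not the role's verbatim source, which lives in the fedora.linux_system_roles.network collection:

```yaml
# Hedged sketch, not the role's exact YAML: a debug task that prints
# the provider chosen earlier via set_fact, producing the
# "Using network provider: nm" message seen in the log above.
- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"
```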
13531 1726882425.98016: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13531 1726882425.98098: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000028 13531 1726882425.98108: variable 'ansible_search_path' from source: unknown 13531 1726882425.98112: variable 'ansible_search_path' from source: unknown 13531 1726882425.98142: calling self._execute() 13531 1726882425.98208: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882425.98211: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882425.98219: variable 'omit' from source: magic vars 13531 1726882425.98495: variable 'ansible_distribution_major_version' from source: facts 13531 1726882425.98506: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882425.98589: variable 'network_state' from source: role '' defaults 13531 1726882425.98597: Evaluated conditional (network_state != {}): False 13531 1726882425.98600: when evaluation is False, skipping this task 13531 1726882425.98602: _execute() done 13531 1726882425.98606: dumping result to json 13531 1726882425.98609: done dumping result, returning 13531 1726882425.98614: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-4fd9-519d-000000000028] 13531 1726882425.98623: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000028 13531 1726882425.98710: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000028 13531 1726882425.98713: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13531 1726882425.98762: no more pending results, 
returning what we have 13531 1726882425.98767: results queue empty 13531 1726882425.98768: checking for any_errors_fatal 13531 1726882425.98774: done checking for any_errors_fatal 13531 1726882425.98774: checking for max_fail_percentage 13531 1726882425.98776: done checking for max_fail_percentage 13531 1726882425.98777: checking to see if all hosts have failed and the running result is not ok 13531 1726882425.98778: done checking to see if all hosts have failed 13531 1726882425.98778: getting the remaining hosts for this loop 13531 1726882425.98780: done getting the remaining hosts for this loop 13531 1726882425.98783: getting the next task for host managed_node2 13531 1726882425.98789: done getting next task for host managed_node2 13531 1726882425.98793: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13531 1726882425.98796: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882425.98810: getting variables 13531 1726882425.98812: in VariableManager get_vars() 13531 1726882425.98857: Calling all_inventory to load vars for managed_node2 13531 1726882425.98860: Calling groups_inventory to load vars for managed_node2 13531 1726882425.98862: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882425.98872: Calling all_plugins_play to load vars for managed_node2 13531 1726882425.98875: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882425.98877: Calling groups_plugins_play to load vars for managed_node2 13531 1726882425.99674: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882426.00620: done with get_vars() 13531 1726882426.00640: done getting variables 13531 1726882426.00687: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:33:46 -0400 (0:00:00.031) 0:00:13.902 ****** 13531 1726882426.00710: entering _queue_task() for managed_node2/fail 13531 1726882426.00934: worker is 1 (out of 1 available) 13531 1726882426.00947: exiting _queue_task() for managed_node2/fail 13531 1726882426.00960: done queuing things up, now waiting for results queue to drain 13531 1726882426.00962: waiting for pending results... 
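The skip result above reports `"false_condition": "network_state != {}"` for the fail task at `roles/network/tasks/main.yml:11`. Structurally, such a guarded abort task looks roughly like the sketch below (the `msg` text is illustrative, not taken from the role source):

```yaml
# Hedged sketch of a guarded fail task like the one skipped above:
# it only fires when `network_state` is non-empty. The log shows the
# condition evaluating to False against the role's default, so the
# task is skipped rather than failing the play.
- name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: Cannot apply network_state with the initscripts provider  # illustrative message
  when: network_state != {}
```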
13531 1726882426.01139: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13531 1726882426.01235: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000029 13531 1726882426.01245: variable 'ansible_search_path' from source: unknown 13531 1726882426.01248: variable 'ansible_search_path' from source: unknown 13531 1726882426.01283: calling self._execute() 13531 1726882426.01350: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882426.01355: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882426.01363: variable 'omit' from source: magic vars 13531 1726882426.01628: variable 'ansible_distribution_major_version' from source: facts 13531 1726882426.01638: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882426.01722: variable 'network_state' from source: role '' defaults 13531 1726882426.01731: Evaluated conditional (network_state != {}): False 13531 1726882426.01734: when evaluation is False, skipping this task 13531 1726882426.01737: _execute() done 13531 1726882426.01739: dumping result to json 13531 1726882426.01742: done dumping result, returning 13531 1726882426.01751: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-4fd9-519d-000000000029] 13531 1726882426.01757: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000029 13531 1726882426.01842: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000029 13531 1726882426.01845: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13531 1726882426.01897: no more pending results, returning what we have 13531 
1726882426.01901: results queue empty 13531 1726882426.01902: checking for any_errors_fatal 13531 1726882426.01908: done checking for any_errors_fatal 13531 1726882426.01908: checking for max_fail_percentage 13531 1726882426.01910: done checking for max_fail_percentage 13531 1726882426.01911: checking to see if all hosts have failed and the running result is not ok 13531 1726882426.01912: done checking to see if all hosts have failed 13531 1726882426.01913: getting the remaining hosts for this loop 13531 1726882426.01914: done getting the remaining hosts for this loop 13531 1726882426.01917: getting the next task for host managed_node2 13531 1726882426.01923: done getting next task for host managed_node2 13531 1726882426.01926: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13531 1726882426.01930: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882426.01944: getting variables 13531 1726882426.01945: in VariableManager get_vars() 13531 1726882426.02000: Calling all_inventory to load vars for managed_node2 13531 1726882426.02003: Calling groups_inventory to load vars for managed_node2 13531 1726882426.02005: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882426.02013: Calling all_plugins_play to load vars for managed_node2 13531 1726882426.02015: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882426.02018: Calling groups_plugins_play to load vars for managed_node2 13531 1726882426.02884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882426.03813: done with get_vars() 13531 1726882426.03831: done getting variables 13531 1726882426.03877: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:33:46 -0400 (0:00:00.031) 0:00:13.934 ****** 13531 1726882426.03901: entering _queue_task() for managed_node2/fail 13531 1726882426.04124: worker is 1 (out of 1 available) 13531 1726882426.04137: exiting _queue_task() for managed_node2/fail 13531 1726882426.04150: done queuing things up, now waiting for results queue to drain 13531 1726882426.04151: waiting for pending results... 
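The next task's guard, whose evaluation appears below as `Evaluated conditional (ansible_distribution_major_version | int > 9): False`, casts the distribution major version fact to an integer before comparing. A hedged sketch of a fail task gated that way (the `msg` is illustrative):

```yaml
# Hedged sketch: a fail task gated on EL10 or later (major version > 9).
# On this managed host the condition evaluates to False, so the
# task is skipped.
- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Teaming is not supported on EL10 or later  # illustrative message
  when: ansible_distribution_major_version | int > 9
```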
13531 1726882426.04333: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13531 1726882426.04424: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000002a 13531 1726882426.04435: variable 'ansible_search_path' from source: unknown 13531 1726882426.04438: variable 'ansible_search_path' from source: unknown 13531 1726882426.04473: calling self._execute() 13531 1726882426.04537: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882426.04541: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882426.04548: variable 'omit' from source: magic vars 13531 1726882426.04819: variable 'ansible_distribution_major_version' from source: facts 13531 1726882426.04829: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882426.04951: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13531 1726882426.06549: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13531 1726882426.06603: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13531 1726882426.06630: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13531 1726882426.06663: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13531 1726882426.06684: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13531 1726882426.06743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882426.06768: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882426.06786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882426.06812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882426.06822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882426.06895: variable 'ansible_distribution_major_version' from source: facts 13531 1726882426.06907: Evaluated conditional (ansible_distribution_major_version | int > 9): False 13531 1726882426.06910: when evaluation is False, skipping this task 13531 1726882426.06913: _execute() done 13531 1726882426.06916: dumping result to json 13531 1726882426.06919: done dumping result, returning 13531 1726882426.06925: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-4fd9-519d-00000000002a] 13531 1726882426.06931: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000002a 13531 1726882426.07023: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000002a 13531 1726882426.07026: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 13531 1726882426.07075: no more pending results, returning what we have 13531 1726882426.07078: 
results queue empty 13531 1726882426.07082: checking for any_errors_fatal 13531 1726882426.07090: done checking for any_errors_fatal 13531 1726882426.07091: checking for max_fail_percentage 13531 1726882426.07092: done checking for max_fail_percentage 13531 1726882426.07093: checking to see if all hosts have failed and the running result is not ok 13531 1726882426.07094: done checking to see if all hosts have failed 13531 1726882426.07094: getting the remaining hosts for this loop 13531 1726882426.07096: done getting the remaining hosts for this loop 13531 1726882426.07100: getting the next task for host managed_node2 13531 1726882426.07105: done getting next task for host managed_node2 13531 1726882426.07109: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13531 1726882426.07113: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882426.07125: getting variables 13531 1726882426.07127: in VariableManager get_vars() 13531 1726882426.07182: Calling all_inventory to load vars for managed_node2 13531 1726882426.07184: Calling groups_inventory to load vars for managed_node2 13531 1726882426.07187: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882426.07200: Calling all_plugins_play to load vars for managed_node2 13531 1726882426.07202: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882426.07205: Calling groups_plugins_play to load vars for managed_node2 13531 1726882426.08012: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882426.09035: done with get_vars() 13531 1726882426.09050: done getting variables 13531 1726882426.09123: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:33:46 -0400 (0:00:00.052) 0:00:13.986 ****** 13531 1726882426.09146: entering _queue_task() for managed_node2/dnf 13531 1726882426.09367: worker is 1 (out of 1 available) 13531 1726882426.09380: exiting _queue_task() for managed_node2/dnf 13531 1726882426.09392: done queuing things up, now waiting for results queue to drain 13531 1726882426.09393: waiting for pending results... 
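The DNF task queued above (from `tasks/main.yml:36`) is ultimately skipped because its guard, `__network_wireless_connections_defined or __network_team_connections_defined`, evaluates to False after the role inspects `network_connections`. A rough sketch of the shape of such a task follows; the module arguments here are illustrative assumptions, not the role's actual arguments:

```yaml
# Hedged sketch only: package name, state, and check_mode are assumed
# for illustration. The guard matches the false_condition reported in
# the skip result in the log below.
- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: NetworkManager  # assumed package for illustration
    state: latest
  check_mode: true
  when: __network_wireless_connections_defined or __network_team_connections_defined
```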
13531 1726882426.09573: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13531 1726882426.09650: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000002b 13531 1726882426.09665: variable 'ansible_search_path' from source: unknown 13531 1726882426.09671: variable 'ansible_search_path' from source: unknown 13531 1726882426.09699: calling self._execute() 13531 1726882426.09762: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882426.09767: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882426.09776: variable 'omit' from source: magic vars 13531 1726882426.10037: variable 'ansible_distribution_major_version' from source: facts 13531 1726882426.10048: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882426.10184: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13531 1726882426.12327: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13531 1726882426.12385: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13531 1726882426.12414: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13531 1726882426.12439: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13531 1726882426.12460: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13531 1726882426.12524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882426.12542: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882426.12564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882426.12593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882426.12608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882426.12698: variable 'ansible_distribution' from source: facts 13531 1726882426.12702: variable 'ansible_distribution_major_version' from source: facts 13531 1726882426.12718: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 13531 1726882426.12796: variable '__network_wireless_connections_defined' from source: role '' defaults 13531 1726882426.12884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882426.12900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882426.12918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882426.12946: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882426.12958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882426.12988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882426.13004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882426.13020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882426.13049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882426.13062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882426.13090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882426.13105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 
1726882426.13121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882426.13148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882426.13161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882426.13259: variable 'network_connections' from source: task vars 13531 1726882426.13270: variable 'controller_profile' from source: play vars 13531 1726882426.13314: variable 'controller_profile' from source: play vars 13531 1726882426.13322: variable 'controller_device' from source: play vars 13531 1726882426.13368: variable 'controller_device' from source: play vars 13531 1726882426.13376: variable 'port1_profile' from source: play vars 13531 1726882426.13417: variable 'port1_profile' from source: play vars 13531 1726882426.13424: variable 'dhcp_interface1' from source: play vars 13531 1726882426.13469: variable 'dhcp_interface1' from source: play vars 13531 1726882426.13481: variable 'controller_profile' from source: play vars 13531 1726882426.13523: variable 'controller_profile' from source: play vars 13531 1726882426.13529: variable 'port2_profile' from source: play vars 13531 1726882426.13573: variable 'port2_profile' from source: play vars 13531 1726882426.13581: variable 'dhcp_interface2' from source: play vars 13531 1726882426.13625: variable 'dhcp_interface2' from source: play vars 13531 1726882426.13630: variable 'controller_profile' from source: play vars 13531 1726882426.13674: variable 'controller_profile' from source: play vars 13531 1726882426.13725: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13531 1726882426.13867: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13531 1726882426.13894: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13531 1726882426.13918: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13531 1726882426.13955: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13531 1726882426.13989: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13531 1726882426.14024: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13531 1726882426.14053: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882426.14075: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13531 1726882426.14132: variable '__network_team_connections_defined' from source: role '' defaults 13531 1726882426.14318: variable 'network_connections' from source: task vars 13531 1726882426.14323: variable 'controller_profile' from source: play vars 13531 1726882426.14372: variable 'controller_profile' from source: play vars 13531 1726882426.14378: variable 'controller_device' from source: play vars 13531 1726882426.14429: variable 'controller_device' from source: play vars 13531 1726882426.14436: variable 
'port1_profile' from source: play vars 13531 1726882426.14492: variable 'port1_profile' from source: play vars 13531 1726882426.14500: variable 'dhcp_interface1' from source: play vars 13531 1726882426.14541: variable 'dhcp_interface1' from source: play vars 13531 1726882426.14546: variable 'controller_profile' from source: play vars 13531 1726882426.14609: variable 'controller_profile' from source: play vars 13531 1726882426.14620: variable 'port2_profile' from source: play vars 13531 1726882426.14683: variable 'port2_profile' from source: play vars 13531 1726882426.14707: variable 'dhcp_interface2' from source: play vars 13531 1726882426.14773: variable 'dhcp_interface2' from source: play vars 13531 1726882426.14788: variable 'controller_profile' from source: play vars 13531 1726882426.14847: variable 'controller_profile' from source: play vars 13531 1726882426.14887: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13531 1726882426.14899: when evaluation is False, skipping this task 13531 1726882426.14907: _execute() done 13531 1726882426.14913: dumping result to json 13531 1726882426.14920: done dumping result, returning 13531 1726882426.14931: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-4fd9-519d-00000000002b] 13531 1726882426.14941: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000002b skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13531 1726882426.15104: no more pending results, returning what we have 13531 1726882426.15108: results queue empty 13531 1726882426.15109: checking for any_errors_fatal 13531 1726882426.15117: done checking for any_errors_fatal 13531 
1726882426.15118: checking for max_fail_percentage 13531 1726882426.15120: done checking for max_fail_percentage 13531 1726882426.15121: checking to see if all hosts have failed and the running result is not ok 13531 1726882426.15121: done checking to see if all hosts have failed 13531 1726882426.15122: getting the remaining hosts for this loop 13531 1726882426.15124: done getting the remaining hosts for this loop 13531 1726882426.15127: getting the next task for host managed_node2 13531 1726882426.15133: done getting next task for host managed_node2 13531 1726882426.15138: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13531 1726882426.15141: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882426.15154: getting variables 13531 1726882426.15156: in VariableManager get_vars() 13531 1726882426.15218: Calling all_inventory to load vars for managed_node2 13531 1726882426.15221: Calling groups_inventory to load vars for managed_node2 13531 1726882426.15224: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882426.15235: Calling all_plugins_play to load vars for managed_node2 13531 1726882426.15238: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882426.15241: Calling groups_plugins_play to load vars for managed_node2 13531 1726882426.16224: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000002b 13531 1726882426.16228: WORKER PROCESS EXITING 13531 1726882426.16410: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882426.17342: done with get_vars() 13531 1726882426.17360: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 13531 1726882426.17415: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:33:46 -0400 (0:00:00.082) 0:00:14.069 ****** 13531 1726882426.17436: entering _queue_task() for managed_node2/yum 13531 1726882426.17437: Creating lock for yum 13531 1726882426.17714: worker is 1 (out of 1 available) 13531 1726882426.17725: exiting _queue_task() for managed_node2/yum 13531 
1726882426.17736: done queuing things up, now waiting for results queue to drain 13531 1726882426.17737: waiting for pending results... 13531 1726882426.18017: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13531 1726882426.18139: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000002c 13531 1726882426.18158: variable 'ansible_search_path' from source: unknown 13531 1726882426.18168: variable 'ansible_search_path' from source: unknown 13531 1726882426.18214: calling self._execute() 13531 1726882426.18303: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882426.18313: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882426.18324: variable 'omit' from source: magic vars 13531 1726882426.18683: variable 'ansible_distribution_major_version' from source: facts 13531 1726882426.18702: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882426.18895: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13531 1726882426.21473: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13531 1726882426.21541: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13531 1726882426.21588: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13531 1726882426.21643: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13531 1726882426.21682: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13531 1726882426.21761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882426.21798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882426.21827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882426.21876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882426.21898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882426.21994: variable 'ansible_distribution_major_version' from source: facts 13531 1726882426.22014: Evaluated conditional (ansible_distribution_major_version | int < 8): False 13531 1726882426.22022: when evaluation is False, skipping this task 13531 1726882426.22029: _execute() done 13531 1726882426.22034: dumping result to json 13531 1726882426.22041: done dumping result, returning 13531 1726882426.22052: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-4fd9-519d-00000000002c] 13531 1726882426.22065: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000002c 13531 1726882426.22176: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000002c 13531 1726882426.22183: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": 
"ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 13531 1726882426.22254: no more pending results, returning what we have 13531 1726882426.22258: results queue empty 13531 1726882426.22259: checking for any_errors_fatal 13531 1726882426.22267: done checking for any_errors_fatal 13531 1726882426.22268: checking for max_fail_percentage 13531 1726882426.22270: done checking for max_fail_percentage 13531 1726882426.22271: checking to see if all hosts have failed and the running result is not ok 13531 1726882426.22272: done checking to see if all hosts have failed 13531 1726882426.22272: getting the remaining hosts for this loop 13531 1726882426.22274: done getting the remaining hosts for this loop 13531 1726882426.22277: getting the next task for host managed_node2 13531 1726882426.22284: done getting next task for host managed_node2 13531 1726882426.22288: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13531 1726882426.22291: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882426.22305: getting variables 13531 1726882426.22307: in VariableManager get_vars() 13531 1726882426.22367: Calling all_inventory to load vars for managed_node2 13531 1726882426.22370: Calling groups_inventory to load vars for managed_node2 13531 1726882426.22373: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882426.22384: Calling all_plugins_play to load vars for managed_node2 13531 1726882426.22387: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882426.22390: Calling groups_plugins_play to load vars for managed_node2 13531 1726882426.24154: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882426.25799: done with get_vars() 13531 1726882426.25830: done getting variables 13531 1726882426.25896: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:33:46 -0400 (0:00:00.084) 0:00:14.154 ****** 13531 1726882426.25931: entering _queue_task() for managed_node2/fail 13531 1726882426.26258: worker is 1 (out of 1 available) 13531 1726882426.26271: exiting _queue_task() for managed_node2/fail 13531 1726882426.26282: done queuing things up, now waiting for results queue to drain 13531 1726882426.26283: waiting for pending results... 
13531 1726882426.26570: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13531 1726882426.26705: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000002d 13531 1726882426.26729: variable 'ansible_search_path' from source: unknown 13531 1726882426.26736: variable 'ansible_search_path' from source: unknown 13531 1726882426.26779: calling self._execute() 13531 1726882426.26866: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882426.26877: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882426.26889: variable 'omit' from source: magic vars 13531 1726882426.27244: variable 'ansible_distribution_major_version' from source: facts 13531 1726882426.27268: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882426.27402: variable '__network_wireless_connections_defined' from source: role '' defaults 13531 1726882426.27632: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13531 1726882426.29481: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13531 1726882426.29534: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13531 1726882426.29562: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13531 1726882426.29590: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13531 1726882426.29610: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13531 1726882426.29672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 13531 1726882426.29694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882426.29713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882426.29740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882426.29751: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882426.29787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882426.29806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882426.29825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882426.29850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882426.29862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882426.29916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882426.29933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882426.29950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882426.29978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882426.29988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882426.30146: variable 'network_connections' from source: task vars 13531 1726882426.30161: variable 'controller_profile' from source: play vars 13531 1726882426.30232: variable 'controller_profile' from source: play vars 13531 1726882426.30246: variable 'controller_device' from source: play vars 13531 1726882426.30309: variable 'controller_device' from source: play vars 13531 1726882426.30323: variable 'port1_profile' from source: play vars 13531 1726882426.30380: variable 'port1_profile' from source: play vars 13531 1726882426.30386: variable 'dhcp_interface1' from source: play vars 13531 1726882426.30428: variable 'dhcp_interface1' from source: play vars 13531 1726882426.30433: variable 'controller_profile' from source: play vars 13531 
1726882426.30479: variable 'controller_profile' from source: play vars 13531 1726882426.30486: variable 'port2_profile' from source: play vars 13531 1726882426.30527: variable 'port2_profile' from source: play vars 13531 1726882426.30533: variable 'dhcp_interface2' from source: play vars 13531 1726882426.30580: variable 'dhcp_interface2' from source: play vars 13531 1726882426.30583: variable 'controller_profile' from source: play vars 13531 1726882426.30624: variable 'controller_profile' from source: play vars 13531 1726882426.30678: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13531 1726882426.30794: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13531 1726882426.30821: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13531 1726882426.30843: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13531 1726882426.30866: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13531 1726882426.30898: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13531 1726882426.30914: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13531 1726882426.30931: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882426.30948: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, 
class_only=False) 13531 1726882426.31013: variable '__network_team_connections_defined' from source: role '' defaults 13531 1726882426.31255: variable 'network_connections' from source: task vars 13531 1726882426.31267: variable 'controller_profile' from source: play vars 13531 1726882426.31326: variable 'controller_profile' from source: play vars 13531 1726882426.31337: variable 'controller_device' from source: play vars 13531 1726882426.31394: variable 'controller_device' from source: play vars 13531 1726882426.31405: variable 'port1_profile' from source: play vars 13531 1726882426.31459: variable 'port1_profile' from source: play vars 13531 1726882426.31471: variable 'dhcp_interface1' from source: play vars 13531 1726882426.31525: variable 'dhcp_interface1' from source: play vars 13531 1726882426.31535: variable 'controller_profile' from source: play vars 13531 1726882426.31590: variable 'controller_profile' from source: play vars 13531 1726882426.31600: variable 'port2_profile' from source: play vars 13531 1726882426.31653: variable 'port2_profile' from source: play vars 13531 1726882426.31665: variable 'dhcp_interface2' from source: play vars 13531 1726882426.31720: variable 'dhcp_interface2' from source: play vars 13531 1726882426.31729: variable 'controller_profile' from source: play vars 13531 1726882426.31785: variable 'controller_profile' from source: play vars 13531 1726882426.31815: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13531 1726882426.31822: when evaluation is False, skipping this task 13531 1726882426.31827: _execute() done 13531 1726882426.31831: dumping result to json 13531 1726882426.31836: done dumping result, returning 13531 1726882426.31845: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-4fd9-519d-00000000002d] 13531 1726882426.31853: sending 
task result for task 0e448fcc-3ce9-4fd9-519d-00000000002d 13531 1726882426.31954: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000002d 13531 1726882426.31957: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13531 1726882426.32004: no more pending results, returning what we have 13531 1726882426.32007: results queue empty 13531 1726882426.32008: checking for any_errors_fatal 13531 1726882426.32013: done checking for any_errors_fatal 13531 1726882426.32014: checking for max_fail_percentage 13531 1726882426.32016: done checking for max_fail_percentage 13531 1726882426.32017: checking to see if all hosts have failed and the running result is not ok 13531 1726882426.32018: done checking to see if all hosts have failed 13531 1726882426.32018: getting the remaining hosts for this loop 13531 1726882426.32019: done getting the remaining hosts for this loop 13531 1726882426.32023: getting the next task for host managed_node2 13531 1726882426.32028: done getting next task for host managed_node2 13531 1726882426.32032: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 13531 1726882426.32035: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882426.32048: getting variables 13531 1726882426.32049: in VariableManager get_vars() 13531 1726882426.32104: Calling all_inventory to load vars for managed_node2 13531 1726882426.32106: Calling groups_inventory to load vars for managed_node2 13531 1726882426.32108: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882426.32118: Calling all_plugins_play to load vars for managed_node2 13531 1726882426.32121: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882426.32123: Calling groups_plugins_play to load vars for managed_node2 13531 1726882426.32958: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882426.36346: done with get_vars() 13531 1726882426.36363: done getting variables 13531 1726882426.36397: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:33:46 -0400 (0:00:00.104) 0:00:14.259 ****** 13531 1726882426.36418: entering _queue_task() for managed_node2/package 13531 1726882426.36644: worker is 1 (out of 1 available) 13531 1726882426.36656: exiting _queue_task() for managed_node2/package 13531 1726882426.36672: done queuing things up, now waiting for results queue to drain 13531 1726882426.36673: waiting for pending results... 
13531 1726882426.36844: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 13531 1726882426.36935: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000002e 13531 1726882426.36945: variable 'ansible_search_path' from source: unknown 13531 1726882426.36949: variable 'ansible_search_path' from source: unknown 13531 1726882426.36986: calling self._execute() 13531 1726882426.37049: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882426.37053: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882426.37069: variable 'omit' from source: magic vars 13531 1726882426.37333: variable 'ansible_distribution_major_version' from source: facts 13531 1726882426.37346: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882426.37484: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13531 1726882426.37699: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13531 1726882426.37742: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13531 1726882426.37798: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13531 1726882426.37835: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13531 1726882426.37944: variable 'network_packages' from source: role '' defaults 13531 1726882426.38043: variable '__network_provider_setup' from source: role '' defaults 13531 1726882426.38057: variable '__network_service_name_default_nm' from source: role '' defaults 13531 1726882426.38118: variable '__network_service_name_default_nm' from source: role '' defaults 13531 1726882426.38130: variable '__network_packages_default_nm' from source: role '' defaults 13531 1726882426.38190: variable 
'__network_packages_default_nm' from source: role '' defaults 13531 1726882426.38354: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13531 1726882426.40199: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13531 1726882426.40240: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13531 1726882426.40277: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13531 1726882426.40305: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13531 1726882426.40324: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13531 1726882426.40389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882426.40409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882426.40426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882426.40452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882426.40467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 
1726882426.40500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882426.40517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882426.40533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882426.40561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882426.40572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882426.40715: variable '__network_packages_default_gobject_packages' from source: role '' defaults 13531 1726882426.40790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882426.40806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882426.40826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882426.40850: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882426.40864: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882426.40923: variable 'ansible_python' from source: facts 13531 1726882426.40945: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 13531 1726882426.41004: variable '__network_wpa_supplicant_required' from source: role '' defaults 13531 1726882426.41063: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13531 1726882426.41147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882426.41166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882426.41183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882426.41207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882426.41217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882426.41247: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882426.41275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882426.41291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882426.41315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882426.41325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882426.41423: variable 'network_connections' from source: task vars 13531 1726882426.41427: variable 'controller_profile' from source: play vars 13531 1726882426.41501: variable 'controller_profile' from source: play vars 13531 1726882426.41509: variable 'controller_device' from source: play vars 13531 1726882426.41581: variable 'controller_device' from source: play vars 13531 1726882426.41590: variable 'port1_profile' from source: play vars 13531 1726882426.41654: variable 'port1_profile' from source: play vars 13531 1726882426.41666: variable 'dhcp_interface1' from source: play vars 13531 1726882426.41732: variable 'dhcp_interface1' from source: play vars 13531 1726882426.41739: variable 'controller_profile' from source: play vars 13531 1726882426.41809: variable 'controller_profile' from source: play vars 13531 1726882426.41820: variable 'port2_profile' from source: play vars 13531 
1726882426.41889: variable 'port2_profile' from source: play vars 13531 1726882426.41897: variable 'dhcp_interface2' from source: play vars 13531 1726882426.41969: variable 'dhcp_interface2' from source: play vars 13531 1726882426.41976: variable 'controller_profile' from source: play vars 13531 1726882426.42043: variable 'controller_profile' from source: play vars 13531 1726882426.42093: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13531 1726882426.42112: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13531 1726882426.42134: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882426.42160: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13531 1726882426.42198: variable '__network_wireless_connections_defined' from source: role '' defaults 13531 1726882426.42378: variable 'network_connections' from source: task vars 13531 1726882426.42381: variable 'controller_profile' from source: play vars 13531 1726882426.42451: variable 'controller_profile' from source: play vars 13531 1726882426.42460: variable 'controller_device' from source: play vars 13531 1726882426.42525: variable 'controller_device' from source: play vars 13531 1726882426.42534: variable 'port1_profile' from source: play vars 13531 1726882426.42605: variable 'port1_profile' from source: play vars 13531 1726882426.42612: variable 'dhcp_interface1' from source: play vars 13531 1726882426.42684: variable 'dhcp_interface1' from source: 
play vars 13531 1726882426.42691: variable 'controller_profile' from source: play vars 13531 1726882426.42755: variable 'controller_profile' from source: play vars 13531 1726882426.42766: variable 'port2_profile' from source: play vars 13531 1726882426.42835: variable 'port2_profile' from source: play vars 13531 1726882426.42842: variable 'dhcp_interface2' from source: play vars 13531 1726882426.42914: variable 'dhcp_interface2' from source: play vars 13531 1726882426.42921: variable 'controller_profile' from source: play vars 13531 1726882426.42990: variable 'controller_profile' from source: play vars 13531 1726882426.43029: variable '__network_packages_default_wireless' from source: role '' defaults 13531 1726882426.43085: variable '__network_wireless_connections_defined' from source: role '' defaults 13531 1726882426.43290: variable 'network_connections' from source: task vars 13531 1726882426.43293: variable 'controller_profile' from source: play vars 13531 1726882426.43340: variable 'controller_profile' from source: play vars 13531 1726882426.43345: variable 'controller_device' from source: play vars 13531 1726882426.43393: variable 'controller_device' from source: play vars 13531 1726882426.43400: variable 'port1_profile' from source: play vars 13531 1726882426.43446: variable 'port1_profile' from source: play vars 13531 1726882426.43452: variable 'dhcp_interface1' from source: play vars 13531 1726882426.43500: variable 'dhcp_interface1' from source: play vars 13531 1726882426.43505: variable 'controller_profile' from source: play vars 13531 1726882426.43551: variable 'controller_profile' from source: play vars 13531 1726882426.43558: variable 'port2_profile' from source: play vars 13531 1726882426.43603: variable 'port2_profile' from source: play vars 13531 1726882426.43609: variable 'dhcp_interface2' from source: play vars 13531 1726882426.43653: variable 'dhcp_interface2' from source: play vars 13531 1726882426.43661: variable 'controller_profile' from 
source: play vars 13531 1726882426.43707: variable 'controller_profile' from source: play vars 13531 1726882426.43725: variable '__network_packages_default_team' from source: role '' defaults 13531 1726882426.43782: variable '__network_team_connections_defined' from source: role '' defaults 13531 1726882426.43978: variable 'network_connections' from source: task vars 13531 1726882426.43985: variable 'controller_profile' from source: play vars 13531 1726882426.44024: variable 'controller_profile' from source: play vars 13531 1726882426.44030: variable 'controller_device' from source: play vars 13531 1726882426.44078: variable 'controller_device' from source: play vars 13531 1726882426.44088: variable 'port1_profile' from source: play vars 13531 1726882426.44135: variable 'port1_profile' from source: play vars 13531 1726882426.44140: variable 'dhcp_interface1' from source: play vars 13531 1726882426.44188: variable 'dhcp_interface1' from source: play vars 13531 1726882426.44196: variable 'controller_profile' from source: play vars 13531 1726882426.44239: variable 'controller_profile' from source: play vars 13531 1726882426.44245: variable 'port2_profile' from source: play vars 13531 1726882426.44294: variable 'port2_profile' from source: play vars 13531 1726882426.44304: variable 'dhcp_interface2' from source: play vars 13531 1726882426.44346: variable 'dhcp_interface2' from source: play vars 13531 1726882426.44351: variable 'controller_profile' from source: play vars 13531 1726882426.44398: variable 'controller_profile' from source: play vars 13531 1726882426.44444: variable '__network_service_name_default_initscripts' from source: role '' defaults 13531 1726882426.44489: variable '__network_service_name_default_initscripts' from source: role '' defaults 13531 1726882426.44494: variable '__network_packages_default_initscripts' from source: role '' defaults 13531 1726882426.44538: variable '__network_packages_default_initscripts' from source: role '' defaults 13531 
1726882426.44673: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 13531 1726882426.44961: variable 'network_connections' from source: task vars 13531 1726882426.44968: variable 'controller_profile' from source: play vars 13531 1726882426.45008: variable 'controller_profile' from source: play vars 13531 1726882426.45016: variable 'controller_device' from source: play vars 13531 1726882426.45054: variable 'controller_device' from source: play vars 13531 1726882426.45065: variable 'port1_profile' from source: play vars 13531 1726882426.45107: variable 'port1_profile' from source: play vars 13531 1726882426.45113: variable 'dhcp_interface1' from source: play vars 13531 1726882426.45153: variable 'dhcp_interface1' from source: play vars 13531 1726882426.45160: variable 'controller_profile' from source: play vars 13531 1726882426.45204: variable 'controller_profile' from source: play vars 13531 1726882426.45210: variable 'port2_profile' from source: play vars 13531 1726882426.45250: variable 'port2_profile' from source: play vars 13531 1726882426.45258: variable 'dhcp_interface2' from source: play vars 13531 1726882426.45302: variable 'dhcp_interface2' from source: play vars 13531 1726882426.45307: variable 'controller_profile' from source: play vars 13531 1726882426.45348: variable 'controller_profile' from source: play vars 13531 1726882426.45354: variable 'ansible_distribution' from source: facts 13531 1726882426.45360: variable '__network_rh_distros' from source: role '' defaults 13531 1726882426.45367: variable 'ansible_distribution_major_version' from source: facts 13531 1726882426.45388: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 13531 1726882426.45495: variable 'ansible_distribution' from source: facts 13531 1726882426.45498: variable '__network_rh_distros' from source: role '' defaults 13531 1726882426.45503: variable 'ansible_distribution_major_version' from source: 
facts 13531 1726882426.45511: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 13531 1726882426.45617: variable 'ansible_distribution' from source: facts 13531 1726882426.45620: variable '__network_rh_distros' from source: role '' defaults 13531 1726882426.45623: variable 'ansible_distribution_major_version' from source: facts 13531 1726882426.45669: variable 'network_provider' from source: set_fact 13531 1726882426.45721: variable 'ansible_facts' from source: unknown 13531 1726882426.46163: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 13531 1726882426.46166: when evaluation is False, skipping this task 13531 1726882426.46170: _execute() done 13531 1726882426.46172: dumping result to json 13531 1726882426.46174: done dumping result, returning 13531 1726882426.46181: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-4fd9-519d-00000000002e] 13531 1726882426.46186: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000002e skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 13531 1726882426.46322: no more pending results, returning what we have 13531 1726882426.46325: results queue empty 13531 1726882426.46326: checking for any_errors_fatal 13531 1726882426.46331: done checking for any_errors_fatal 13531 1726882426.46332: checking for max_fail_percentage 13531 1726882426.46333: done checking for max_fail_percentage 13531 1726882426.46334: checking to see if all hosts have failed and the running result is not ok 13531 1726882426.46335: done checking to see if all hosts have failed 13531 1726882426.46335: getting the remaining hosts for this loop 13531 1726882426.46337: done getting the remaining hosts for this loop 13531 1726882426.46340: getting the next task for host 
managed_node2 13531 1726882426.46346: done getting next task for host managed_node2 13531 1726882426.46350: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 13531 1726882426.46352: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882426.46367: getting variables 13531 1726882426.46369: in VariableManager get_vars() 13531 1726882426.46420: Calling all_inventory to load vars for managed_node2 13531 1726882426.46422: Calling groups_inventory to load vars for managed_node2 13531 1726882426.46425: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882426.46435: Calling all_plugins_play to load vars for managed_node2 13531 1726882426.46437: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882426.46439: Calling groups_plugins_play to load vars for managed_node2 13531 1726882426.47061: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000002e 13531 1726882426.47068: WORKER PROCESS EXITING 13531 1726882426.47467: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882426.48859: done with get_vars() 13531 1726882426.48879: done getting variables 13531 1726882426.48922: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
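The `Install packages` skip recorded above ("Conditional result was False") is what a guarded install task produces when every name in `network_packages` already appears in the gathered package facts. A minimal sketch of a task with that shape — hypothetical, reconstructed only from the `false_condition` string in the log, not the role's actual source at `roles/network/tasks/main.yml`:

```yaml
# Hypothetical reconstruction from the logged conditional; the real role task may differ.
- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present
  # Skipped when the installed-package facts already cover network_packages,
  # which is exactly the "false_condition" shown in the skip result above.
  when: not network_packages is subset(ansible_facts.packages.keys())
```

The `subset` test requires package facts (e.g. from `ansible.builtin.package_facts`) to have been gathered earlier, which the log's `variable 'ansible_facts' from source: unknown` line is consistent with.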
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:33:46 -0400 (0:00:00.125) 0:00:14.384 ****** 13531 1726882426.48946: entering _queue_task() for managed_node2/package 13531 1726882426.49172: worker is 1 (out of 1 available) 13531 1726882426.49184: exiting _queue_task() for managed_node2/package 13531 1726882426.49195: done queuing things up, now waiting for results queue to drain 13531 1726882426.49197: waiting for pending results... 13531 1726882426.49372: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 13531 1726882426.49455: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000002f 13531 1726882426.49468: variable 'ansible_search_path' from source: unknown 13531 1726882426.49472: variable 'ansible_search_path' from source: unknown 13531 1726882426.49502: calling self._execute() 13531 1726882426.49574: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882426.49577: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882426.49585: variable 'omit' from source: magic vars 13531 1726882426.49859: variable 'ansible_distribution_major_version' from source: facts 13531 1726882426.49873: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882426.49960: variable 'network_state' from source: role '' defaults 13531 1726882426.49965: Evaluated conditional (network_state != {}): False 13531 1726882426.49968: when evaluation is False, skipping this task 13531 1726882426.49972: _execute() done 13531 
1726882426.49980: dumping result to json 13531 1726882426.49983: done dumping result, returning 13531 1726882426.49990: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-4fd9-519d-00000000002f] 13531 1726882426.49996: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000002f 13531 1726882426.50092: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000002f 13531 1726882426.50095: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13531 1726882426.50137: no more pending results, returning what we have 13531 1726882426.50141: results queue empty 13531 1726882426.50142: checking for any_errors_fatal 13531 1726882426.50149: done checking for any_errors_fatal 13531 1726882426.50150: checking for max_fail_percentage 13531 1726882426.50152: done checking for max_fail_percentage 13531 1726882426.50155: checking to see if all hosts have failed and the running result is not ok 13531 1726882426.50155: done checking to see if all hosts have failed 13531 1726882426.50156: getting the remaining hosts for this loop 13531 1726882426.50157: done getting the remaining hosts for this loop 13531 1726882426.50161: getting the next task for host managed_node2 13531 1726882426.50168: done getting next task for host managed_node2 13531 1726882426.50172: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 13531 1726882426.50178: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882426.50193: getting variables 13531 1726882426.50194: in VariableManager get_vars() 13531 1726882426.50238: Calling all_inventory to load vars for managed_node2 13531 1726882426.50240: Calling groups_inventory to load vars for managed_node2 13531 1726882426.50242: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882426.50251: Calling all_plugins_play to load vars for managed_node2 13531 1726882426.50256: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882426.50259: Calling groups_plugins_play to load vars for managed_node2 13531 1726882426.51736: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882426.53496: done with get_vars() 13531 1726882426.53527: done getting variables 13531 1726882426.53587: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:33:46 -0400 (0:00:00.046) 0:00:14.431 ****** 13531 1726882426.53619: entering _queue_task() for managed_node2/package 13531 1726882426.53912: worker is 1 (out of 1 available) 13531 1726882426.53923: exiting _queue_task() 
for managed_node2/package 13531 1726882426.53935: done queuing things up, now waiting for results queue to drain 13531 1726882426.53937: waiting for pending results... 13531 1726882426.54219: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 13531 1726882426.54376: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000030 13531 1726882426.54397: variable 'ansible_search_path' from source: unknown 13531 1726882426.54404: variable 'ansible_search_path' from source: unknown 13531 1726882426.54446: calling self._execute() 13531 1726882426.54545: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882426.54559: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882426.54574: variable 'omit' from source: magic vars 13531 1726882426.54955: variable 'ansible_distribution_major_version' from source: facts 13531 1726882426.54975: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882426.55108: variable 'network_state' from source: role '' defaults 13531 1726882426.55122: Evaluated conditional (network_state != {}): False 13531 1726882426.55131: when evaluation is False, skipping this task 13531 1726882426.55142: _execute() done 13531 1726882426.55150: dumping result to json 13531 1726882426.55161: done dumping result, returning 13531 1726882426.55175: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-4fd9-519d-000000000030] 13531 1726882426.55187: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000030 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13531 1726882426.55322: no more pending results, returning what we have 13531 1726882426.55327: results queue empty 13531 1726882426.55328: 
checking for any_errors_fatal 13531 1726882426.55337: done checking for any_errors_fatal 13531 1726882426.55338: checking for max_fail_percentage 13531 1726882426.55340: done checking for max_fail_percentage 13531 1726882426.55340: checking to see if all hosts have failed and the running result is not ok 13531 1726882426.55341: done checking to see if all hosts have failed 13531 1726882426.55342: getting the remaining hosts for this loop 13531 1726882426.55344: done getting the remaining hosts for this loop 13531 1726882426.55347: getting the next task for host managed_node2 13531 1726882426.55357: done getting next task for host managed_node2 13531 1726882426.55362: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 13531 1726882426.55368: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882426.55384: getting variables 13531 1726882426.55386: in VariableManager get_vars() 13531 1726882426.55445: Calling all_inventory to load vars for managed_node2 13531 1726882426.55448: Calling groups_inventory to load vars for managed_node2 13531 1726882426.55450: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882426.55468: Calling all_plugins_play to load vars for managed_node2 13531 1726882426.55471: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882426.55475: Calling groups_plugins_play to load vars for managed_node2 13531 1726882426.56537: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000030 13531 1726882426.56541: WORKER PROCESS EXITING 13531 1726882426.57218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882426.60213: done with get_vars() 13531 1726882426.60236: done getting variables 13531 1726882426.60346: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:33:46 -0400 (0:00:00.067) 0:00:14.499 ****** 13531 1726882426.60383: entering _queue_task() for managed_node2/service 13531 1726882426.60385: Creating lock for service 13531 1726882426.60838: worker is 1 (out of 1 available) 13531 1726882426.60851: exiting _queue_task() for managed_node2/service 13531 1726882426.60869: done queuing things up, now waiting for results queue to drain 13531 1726882426.60871: waiting for pending results... 
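Both `network_state`-guarded tasks above (`Install NetworkManager and nmstate...` and `Install python3-libnmstate...`) skip for the same reason: `network_state` is still at its role default of `{}`. A hedged sketch of what such a task looks like — the package list is an assumption for illustration, not taken from the log:

```yaml
# Hypothetical sketch of the network_state guard seen in the log.
# Package names are assumed; the log only records the conditionals.
- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager   # assumed
      - nmstate          # assumed
    state: present
  when:
    - ansible_distribution_major_version != '6'   # evaluated True in the log
    - network_state != {}                         # evaluated False -> task skipped
```

Note the log evaluates the distribution-version condition first (`True`), then the `network_state` condition (`False`), matching a `when:` list evaluated in order.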
13531 1726882426.61209: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 13531 1726882426.61341: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000031 13531 1726882426.61362: variable 'ansible_search_path' from source: unknown 13531 1726882426.61372: variable 'ansible_search_path' from source: unknown 13531 1726882426.61410: calling self._execute() 13531 1726882426.61512: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882426.61522: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882426.61536: variable 'omit' from source: magic vars 13531 1726882426.61920: variable 'ansible_distribution_major_version' from source: facts 13531 1726882426.61936: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882426.62225: variable '__network_wireless_connections_defined' from source: role '' defaults 13531 1726882426.62436: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13531 1726882426.65332: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13531 1726882426.65418: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13531 1726882426.65461: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13531 1726882426.65503: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13531 1726882426.65537: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13531 1726882426.65622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 13531 1726882426.65662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882426.65698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882426.65749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882426.65774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882426.65824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882426.65861: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882426.65894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882426.65940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882426.65967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882426.66007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882426.66032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882426.66061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882426.66104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882426.66120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882426.66291: variable 'network_connections' from source: task vars 13531 1726882426.66305: variable 'controller_profile' from source: play vars 13531 1726882426.66379: variable 'controller_profile' from source: play vars 13531 1726882426.66395: variable 'controller_device' from source: play vars 13531 1726882426.66456: variable 'controller_device' from source: play vars 13531 1726882426.66474: variable 'port1_profile' from source: play vars 13531 1726882426.66534: variable 'port1_profile' from source: play vars 13531 1726882426.66545: variable 'dhcp_interface1' from source: play vars 13531 1726882426.66613: variable 'dhcp_interface1' from source: play vars 13531 1726882426.66625: variable 'controller_profile' from source: play vars 13531 
1726882426.66692: variable 'controller_profile' from source: play vars 13531 1726882426.66705: variable 'port2_profile' from source: play vars 13531 1726882426.66776: variable 'port2_profile' from source: play vars 13531 1726882426.66789: variable 'dhcp_interface2' from source: play vars 13531 1726882426.66859: variable 'dhcp_interface2' from source: play vars 13531 1726882426.66873: variable 'controller_profile' from source: play vars 13531 1726882426.66936: variable 'controller_profile' from source: play vars 13531 1726882426.67015: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13531 1726882426.67205: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13531 1726882426.67243: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13531 1726882426.67279: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13531 1726882426.67307: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13531 1726882426.67348: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13531 1726882426.67376: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13531 1726882426.67403: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882426.67428: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, 
class_only=False) 13531 1726882426.67504: variable '__network_team_connections_defined' from source: role '' defaults 13531 1726882426.67750: variable 'network_connections' from source: task vars 13531 1726882426.67762: variable 'controller_profile' from source: play vars 13531 1726882426.67830: variable 'controller_profile' from source: play vars 13531 1726882426.67841: variable 'controller_device' from source: play vars 13531 1726882426.67905: variable 'controller_device' from source: play vars 13531 1726882426.67920: variable 'port1_profile' from source: play vars 13531 1726882426.67986: variable 'port1_profile' from source: play vars 13531 1726882426.67997: variable 'dhcp_interface1' from source: play vars 13531 1726882426.68066: variable 'dhcp_interface1' from source: play vars 13531 1726882426.68077: variable 'controller_profile' from source: play vars 13531 1726882426.68139: variable 'controller_profile' from source: play vars 13531 1726882426.68156: variable 'port2_profile' from source: play vars 13531 1726882426.68219: variable 'port2_profile' from source: play vars 13531 1726882426.68232: variable 'dhcp_interface2' from source: play vars 13531 1726882426.68303: variable 'dhcp_interface2' from source: play vars 13531 1726882426.68315: variable 'controller_profile' from source: play vars 13531 1726882426.68384: variable 'controller_profile' from source: play vars 13531 1726882426.68422: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13531 1726882426.68429: when evaluation is False, skipping this task 13531 1726882426.68437: _execute() done 13531 1726882426.68443: dumping result to json 13531 1726882426.68451: done dumping result, returning 13531 1726882426.68467: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-4fd9-519d-000000000031] 13531 1726882426.68479: sending task result for task 
0e448fcc-3ce9-4fd9-519d-000000000031 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13531 1726882426.68617: no more pending results, returning what we have 13531 1726882426.68621: results queue empty 13531 1726882426.68622: checking for any_errors_fatal 13531 1726882426.68627: done checking for any_errors_fatal 13531 1726882426.68628: checking for max_fail_percentage 13531 1726882426.68630: done checking for max_fail_percentage 13531 1726882426.68630: checking to see if all hosts have failed and the running result is not ok 13531 1726882426.68631: done checking to see if all hosts have failed 13531 1726882426.68632: getting the remaining hosts for this loop 13531 1726882426.68633: done getting the remaining hosts for this loop 13531 1726882426.68636: getting the next task for host managed_node2 13531 1726882426.68643: done getting next task for host managed_node2 13531 1726882426.68646: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13531 1726882426.68649: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882426.68669: getting variables 13531 1726882426.68671: in VariableManager get_vars() 13531 1726882426.68725: Calling all_inventory to load vars for managed_node2 13531 1726882426.68727: Calling groups_inventory to load vars for managed_node2 13531 1726882426.68730: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882426.68740: Calling all_plugins_play to load vars for managed_node2 13531 1726882426.68743: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882426.68745: Calling groups_plugins_play to load vars for managed_node2 13531 1726882426.70083: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000031 13531 1726882426.70087: WORKER PROCESS EXITING 13531 1726882426.70496: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882426.72188: done with get_vars() 13531 1726882426.72217: done getting variables 13531 1726882426.72287: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:33:46 -0400 (0:00:00.119) 0:00:14.618 ****** 13531 1726882426.72321: entering _queue_task() for managed_node2/service 13531 1726882426.73120: worker is 1 (out of 1 available) 13531 1726882426.73134: exiting _queue_task() for managed_node2/service 13531 1726882426.73147: done queuing things up, now waiting for results queue to drain 13531 1726882426.73148: waiting for pending results... 
13531 1726882426.74063: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13531 1726882426.74297: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000032 13531 1726882426.74315: variable 'ansible_search_path' from source: unknown 13531 1726882426.74434: variable 'ansible_search_path' from source: unknown 13531 1726882426.74474: calling self._execute() 13531 1726882426.74677: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882426.74686: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882426.74691: variable 'omit' from source: magic vars 13531 1726882426.75528: variable 'ansible_distribution_major_version' from source: facts 13531 1726882426.75546: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882426.75827: variable 'network_provider' from source: set_fact 13531 1726882426.75960: variable 'network_state' from source: role '' defaults 13531 1726882426.75978: Evaluated conditional (network_provider == "nm" or network_state != {}): True 13531 1726882426.75991: variable 'omit' from source: magic vars 13531 1726882426.76052: variable 'omit' from source: magic vars 13531 1726882426.76202: variable 'network_service_name' from source: role '' defaults 13531 1726882426.76277: variable 'network_service_name' from source: role '' defaults 13531 1726882426.76403: variable '__network_provider_setup' from source: role '' defaults 13531 1726882426.76416: variable '__network_service_name_default_nm' from source: role '' defaults 13531 1726882426.76482: variable '__network_service_name_default_nm' from source: role '' defaults 13531 1726882426.76498: variable '__network_packages_default_nm' from source: role '' defaults 13531 1726882426.76569: variable '__network_packages_default_nm' from source: role '' defaults 13531 1726882426.76805: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 13531 1726882426.79541: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13531 1726882426.80296: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13531 1726882426.80354: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13531 1726882426.80396: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13531 1726882426.80432: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13531 1726882426.80514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882426.80554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882426.80589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882426.80640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882426.80661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882426.80713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 13531 1726882426.80742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882426.80778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882426.80822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882426.80842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882426.81089: variable '__network_packages_default_gobject_packages' from source: role '' defaults 13531 1726882426.81214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882426.81243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882426.81275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882426.81323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882426.81342: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882426.81443: variable 'ansible_python' from source: facts 13531 1726882426.81474: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 13531 1726882426.81569: variable '__network_wpa_supplicant_required' from source: role '' defaults 13531 1726882426.81659: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13531 1726882426.81795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882426.81827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882426.81860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882426.81909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882426.81928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882426.81984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882426.82021: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882426.82070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882426.82113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882426.82133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882426.82404: variable 'network_connections' from source: task vars 13531 1726882426.82419: variable 'controller_profile' from source: play vars 13531 1726882426.82497: variable 'controller_profile' from source: play vars 13531 1726882426.82512: variable 'controller_device' from source: play vars 13531 1726882426.82590: variable 'controller_device' from source: play vars 13531 1726882426.82610: variable 'port1_profile' from source: play vars 13531 1726882426.82689: variable 'port1_profile' from source: play vars 13531 1726882426.82705: variable 'dhcp_interface1' from source: play vars 13531 1726882426.82790: variable 'dhcp_interface1' from source: play vars 13531 1726882426.82805: variable 'controller_profile' from source: play vars 13531 1726882426.82884: variable 'controller_profile' from source: play vars 13531 1726882426.82902: variable 'port2_profile' from source: play vars 13531 1726882426.82983: variable 'port2_profile' from source: play vars 13531 1726882426.82999: variable 'dhcp_interface2' from source: play vars 13531 1726882426.83077: variable 'dhcp_interface2' from source: play vars 13531 
1726882426.83094: variable 'controller_profile' from source: play vars 13531 1726882426.83169: variable 'controller_profile' from source: play vars 13531 1726882426.83519: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13531 1726882426.83927: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13531 1726882426.83988: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13531 1726882426.84096: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13531 1726882426.84203: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13531 1726882426.84298: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13531 1726882426.84334: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13531 1726882426.84376: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882426.84419: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13531 1726882426.84477: variable '__network_wireless_connections_defined' from source: role '' defaults 13531 1726882426.84798: variable 'network_connections' from source: task vars 13531 1726882426.84811: variable 'controller_profile' from source: play vars 13531 1726882426.84895: variable 'controller_profile' from source: play vars 13531 
1726882426.84912: variable 'controller_device' from source: play vars 13531 1726882426.85007: variable 'controller_device' from source: play vars 13531 1726882426.85025: variable 'port1_profile' from source: play vars 13531 1726882426.85105: variable 'port1_profile' from source: play vars 13531 1726882426.85120: variable 'dhcp_interface1' from source: play vars 13531 1726882426.85197: variable 'dhcp_interface1' from source: play vars 13531 1726882426.85214: variable 'controller_profile' from source: play vars 13531 1726882426.85308: variable 'controller_profile' from source: play vars 13531 1726882426.85326: variable 'port2_profile' from source: play vars 13531 1726882426.85404: variable 'port2_profile' from source: play vars 13531 1726882426.85420: variable 'dhcp_interface2' from source: play vars 13531 1726882426.85499: variable 'dhcp_interface2' from source: play vars 13531 1726882426.85514: variable 'controller_profile' from source: play vars 13531 1726882426.85591: variable 'controller_profile' from source: play vars 13531 1726882426.85658: variable '__network_packages_default_wireless' from source: role '' defaults 13531 1726882426.85747: variable '__network_wireless_connections_defined' from source: role '' defaults 13531 1726882426.86093: variable 'network_connections' from source: task vars 13531 1726882426.86109: variable 'controller_profile' from source: play vars 13531 1726882426.86459: variable 'controller_profile' from source: play vars 13531 1726882426.86479: variable 'controller_device' from source: play vars 13531 1726882426.86562: variable 'controller_device' from source: play vars 13531 1726882426.86580: variable 'port1_profile' from source: play vars 13531 1726882426.86656: variable 'port1_profile' from source: play vars 13531 1726882426.86674: variable 'dhcp_interface1' from source: play vars 13531 1726882426.86810: variable 'dhcp_interface1' from source: play vars 13531 1726882426.86852: variable 'controller_profile' from source: play vars 
13531 1726882426.86960: variable 'controller_profile' from source: play vars 13531 1726882426.86980: variable 'port2_profile' from source: play vars 13531 1726882426.87055: variable 'port2_profile' from source: play vars 13531 1726882426.87073: variable 'dhcp_interface2' from source: play vars 13531 1726882426.87147: variable 'dhcp_interface2' from source: play vars 13531 1726882426.87161: variable 'controller_profile' from source: play vars 13531 1726882426.87234: variable 'controller_profile' from source: play vars 13531 1726882426.87273: variable '__network_packages_default_team' from source: role '' defaults 13531 1726882426.87353: variable '__network_team_connections_defined' from source: role '' defaults 13531 1726882426.87844: variable 'network_connections' from source: task vars 13531 1726882426.87853: variable 'controller_profile' from source: play vars 13531 1726882426.87929: variable 'controller_profile' from source: play vars 13531 1726882426.87940: variable 'controller_device' from source: play vars 13531 1726882426.88011: variable 'controller_device' from source: play vars 13531 1726882426.88028: variable 'port1_profile' from source: play vars 13531 1726882426.88099: variable 'port1_profile' from source: play vars 13531 1726882426.88110: variable 'dhcp_interface1' from source: play vars 13531 1726882426.88186: variable 'dhcp_interface1' from source: play vars 13531 1726882426.88196: variable 'controller_profile' from source: play vars 13531 1726882426.88293: variable 'controller_profile' from source: play vars 13531 1726882426.88304: variable 'port2_profile' from source: play vars 13531 1726882426.88389: variable 'port2_profile' from source: play vars 13531 1726882426.88400: variable 'dhcp_interface2' from source: play vars 13531 1726882426.88483: variable 'dhcp_interface2' from source: play vars 13531 1726882426.88496: variable 'controller_profile' from source: play vars 13531 1726882426.88573: variable 'controller_profile' from source: play vars 
13531 1726882426.88647: variable '__network_service_name_default_initscripts' from source: role '' defaults 13531 1726882426.89563: variable '__network_service_name_default_initscripts' from source: role '' defaults 13531 1726882426.89582: variable '__network_packages_default_initscripts' from source: role '' defaults 13531 1726882426.89656: variable '__network_packages_default_initscripts' from source: role '' defaults 13531 1726882426.90092: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 13531 1726882426.90583: variable 'network_connections' from source: task vars 13531 1726882426.90667: variable 'controller_profile' from source: play vars 13531 1726882426.90729: variable 'controller_profile' from source: play vars 13531 1726882426.90742: variable 'controller_device' from source: play vars 13531 1726882426.90806: variable 'controller_device' from source: play vars 13531 1726882426.90820: variable 'port1_profile' from source: play vars 13531 1726882426.90881: variable 'port1_profile' from source: play vars 13531 1726882426.90893: variable 'dhcp_interface1' from source: play vars 13531 1726882426.90954: variable 'dhcp_interface1' from source: play vars 13531 1726882426.91012: variable 'controller_profile' from source: play vars 13531 1726882426.91075: variable 'controller_profile' from source: play vars 13531 1726882426.91087: variable 'port2_profile' from source: play vars 13531 1726882426.91145: variable 'port2_profile' from source: play vars 13531 1726882426.91156: variable 'dhcp_interface2' from source: play vars 13531 1726882426.91218: variable 'dhcp_interface2' from source: play vars 13531 1726882426.91229: variable 'controller_profile' from source: play vars 13531 1726882426.91290: variable 'controller_profile' from source: play vars 13531 1726882426.91304: variable 'ansible_distribution' from source: facts 13531 1726882426.91313: variable '__network_rh_distros' from source: role '' defaults 13531 1726882426.91323: 
variable 'ansible_distribution_major_version' from source: facts 13531 1726882426.91355: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 13531 1726882426.91532: variable 'ansible_distribution' from source: facts 13531 1726882426.91774: variable '__network_rh_distros' from source: role '' defaults 13531 1726882426.91802: variable 'ansible_distribution_major_version' from source: facts 13531 1726882426.91821: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 13531 1726882426.92085: variable 'ansible_distribution' from source: facts 13531 1726882426.92098: variable '__network_rh_distros' from source: role '' defaults 13531 1726882426.92108: variable 'ansible_distribution_major_version' from source: facts 13531 1726882426.92150: variable 'network_provider' from source: set_fact 13531 1726882426.92190: variable 'omit' from source: magic vars 13531 1726882426.92242: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882426.92276: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882426.92318: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882426.92341: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882426.92356: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882426.92418: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882426.92429: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882426.92437: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882426.92544: Set connection var ansible_pipelining to False 13531 
1726882426.92555: Set connection var ansible_timeout to 10 13531 1726882426.92576: Set connection var ansible_shell_executable to /bin/sh 13531 1726882426.92586: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882426.92592: Set connection var ansible_connection to ssh 13531 1726882426.92598: Set connection var ansible_shell_type to sh 13531 1726882426.92630: variable 'ansible_shell_executable' from source: unknown 13531 1726882426.92639: variable 'ansible_connection' from source: unknown 13531 1726882426.92649: variable 'ansible_module_compression' from source: unknown 13531 1726882426.92655: variable 'ansible_shell_type' from source: unknown 13531 1726882426.92663: variable 'ansible_shell_executable' from source: unknown 13531 1726882426.92671: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882426.92678: variable 'ansible_pipelining' from source: unknown 13531 1726882426.92684: variable 'ansible_timeout' from source: unknown 13531 1726882426.92692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882426.92802: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882426.92817: variable 'omit' from source: magic vars 13531 1726882426.92827: starting attempt loop 13531 1726882426.92836: running the handler 13531 1726882426.92921: variable 'ansible_facts' from source: unknown 13531 1726882426.93795: _low_level_execute_command(): starting 13531 1726882426.93806: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13531 1726882426.94643: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882426.94660: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 13531 1726882426.94679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882426.94702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882426.94749: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882426.94762: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882426.94779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882426.94796: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882426.94812: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882426.94824: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882426.94839: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882426.94852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882426.94870: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882426.94883: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882426.94893: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882426.94906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882426.94990: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882426.95012: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882426.95031: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882426.95182: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 13531 1726882426.96904: stdout chunk (state=3): >>>/root <<< 13531 1726882426.97097: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882426.97100: stdout chunk (state=3): >>><<< 13531 1726882426.97107: stderr chunk (state=3): >>><<< 13531 1726882426.97222: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882426.97225: _low_level_execute_command(): starting 13531 1726882426.97229: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882426.9713032-14216-264273426104262 `" && echo ansible-tmp-1726882426.9713032-14216-264273426104262="` echo /root/.ansible/tmp/ansible-tmp-1726882426.9713032-14216-264273426104262 `" ) && sleep 0' 13531 1726882426.97827: stderr chunk 
(state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882426.97841: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882426.97857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882426.97879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882426.97928: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882426.97941: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882426.97956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882426.97976: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882426.97988: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882426.98005: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882426.98018: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882426.98031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882426.98047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882426.98060: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882426.98075: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882426.98089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882426.98173: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882426.98196: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882426.98212: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 13531 1726882426.98354: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882427.00233: stdout chunk (state=3): >>>ansible-tmp-1726882426.9713032-14216-264273426104262=/root/.ansible/tmp/ansible-tmp-1726882426.9713032-14216-264273426104262 <<< 13531 1726882427.00380: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882427.00445: stderr chunk (state=3): >>><<< 13531 1726882427.00449: stdout chunk (state=3): >>><<< 13531 1726882427.00769: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882426.9713032-14216-264273426104262=/root/.ansible/tmp/ansible-tmp-1726882426.9713032-14216-264273426104262 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882427.00773: variable 'ansible_module_compression' from source: unknown 13531 1726882427.00777: 
ANSIBALLZ: Using generic lock for ansible.legacy.systemd 13531 1726882427.00779: ANSIBALLZ: Acquiring lock 13531 1726882427.00782: ANSIBALLZ: Lock acquired: 139969312288320 13531 1726882427.00784: ANSIBALLZ: Creating module 13531 1726882427.47298: ANSIBALLZ: Writing module into payload 13531 1726882427.47734: ANSIBALLZ: Writing module 13531 1726882427.47779: ANSIBALLZ: Renaming module 13531 1726882427.47791: ANSIBALLZ: Done creating module 13531 1726882427.47819: variable 'ansible_facts' from source: unknown 13531 1726882427.48084: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882426.9713032-14216-264273426104262/AnsiballZ_systemd.py 13531 1726882427.48829: Sending initial data 13531 1726882427.48833: Sent initial data (156 bytes) 13531 1726882427.51054: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882427.51075: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882427.51091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882427.51111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882427.51269: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882427.51283: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882427.51297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882427.51314: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882427.51325: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882427.51338: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882427.51351: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882427.51367: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882427.51383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882427.51396: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882427.51412: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882427.51425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882427.51506: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882427.51579: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882427.51596: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882427.51738: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882427.53597: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13531 1726882427.53695: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13531 1726882427.53794: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13531i_snf1g8/tmpy6eqttfi /root/.ansible/tmp/ansible-tmp-1726882426.9713032-14216-264273426104262/AnsiballZ_systemd.py <<< 13531 
1726882427.53894: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13531 1726882427.57372: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882427.57531: stderr chunk (state=3): >>><<< 13531 1726882427.57534: stdout chunk (state=3): >>><<< 13531 1726882427.57537: done transferring module to remote 13531 1726882427.57539: _low_level_execute_command(): starting 13531 1726882427.57542: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882426.9713032-14216-264273426104262/ /root/.ansible/tmp/ansible-tmp-1726882426.9713032-14216-264273426104262/AnsiballZ_systemd.py && sleep 0' 13531 1726882427.58142: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882427.58158: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882427.58185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882427.58205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882427.58247: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882427.58260: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882427.58278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882427.58302: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882427.58314: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882427.58325: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882427.58337: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882427.58351: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 13531 1726882427.58369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882427.58383: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882427.58395: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882427.58414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882427.58490: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882427.58520: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882427.58538: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882427.58676: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882427.60558: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882427.60587: stderr chunk (state=3): >>><<< 13531 1726882427.60590: stdout chunk (state=3): >>><<< 13531 1726882427.60604: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882427.60607: _low_level_execute_command(): starting 13531 1726882427.60612: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882426.9713032-14216-264273426104262/AnsiballZ_systemd.py && sleep 0' 13531 1726882427.61119: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882427.61126: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882427.61135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882427.61145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882427.61176: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882427.61181: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882427.61190: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882427.61199: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882427.61204: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882427.61208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882427.61224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 
10.31.11.158 debug2: match found <<< 13531 1726882427.61228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882427.61277: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882427.61302: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882427.61305: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882427.61415: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882427.86708: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; 
flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManag<<< 13531 1726882427.86766: stdout chunk (state=3): >>>er.service", "ControlGroupId": "3602", "MemoryCurrent": "8916992", "MemoryAvailable": "infinity", "CPUUsageNSec": "775382000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", 
"ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", 
"PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", 
"InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 13531 1726882427.88417: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 13531 1726882427.88448: stderr chunk (state=3): >>><<< 13531 1726882427.88451: stdout chunk (state=3): >>><<< 13531 1726882427.88907: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call 
org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "8916992", "MemoryAvailable": "infinity", "CPUUsageNSec": "775382000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", 
"LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", 
"MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", 
"RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 13531 1726882427.88916: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882426.9713032-14216-264273426104262/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13531 1726882427.88918: _low_level_execute_command(): starting 13531 1726882427.88921: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882426.9713032-14216-264273426104262/ > /dev/null 2>&1 && sleep 0' 13531 1726882427.89635: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882427.89672: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882427.89688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882427.89717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882427.89762: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882427.89776: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882427.89789: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882427.89805: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882427.89815: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882427.89825: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882427.89835: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882427.89848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882427.89871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882427.89884: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882427.89894: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882427.89906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882427.89985: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882427.90007: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882427.90021: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882427.90156: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882427.92089: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882427.92160: stderr chunk (state=3): >>><<< 13531 1726882427.92165: stdout chunk (state=3): >>><<< 13531 1726882427.92270: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882427.92273: handler run complete 13531 1726882427.92276: attempt loop complete, returning result 13531 1726882427.92278: _execute() done 13531 1726882427.92280: dumping result to json 13531 1726882427.92476: done dumping result, returning 13531 1726882427.92479: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-4fd9-519d-000000000032] 13531 1726882427.92481: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000032 13531 1726882427.92601: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000032 13531 1726882427.92604: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13531 1726882427.92651: no more pending results, returning what we have 13531 1726882427.92654: results queue empty 13531 1726882427.92655: checking for any_errors_fatal 13531 1726882427.92662: done checking for 
any_errors_fatal 13531 1726882427.92665: checking for max_fail_percentage 13531 1726882427.92667: done checking for max_fail_percentage 13531 1726882427.92667: checking to see if all hosts have failed and the running result is not ok 13531 1726882427.92668: done checking to see if all hosts have failed 13531 1726882427.92669: getting the remaining hosts for this loop 13531 1726882427.92670: done getting the remaining hosts for this loop 13531 1726882427.92674: getting the next task for host managed_node2 13531 1726882427.92679: done getting next task for host managed_node2 13531 1726882427.92684: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13531 1726882427.92687: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882427.92696: getting variables 13531 1726882427.92698: in VariableManager get_vars() 13531 1726882427.92744: Calling all_inventory to load vars for managed_node2 13531 1726882427.92746: Calling groups_inventory to load vars for managed_node2 13531 1726882427.92748: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882427.92758: Calling all_plugins_play to load vars for managed_node2 13531 1726882427.92761: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882427.92766: Calling groups_plugins_play to load vars for managed_node2 13531 1726882427.94493: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882427.96813: done with get_vars() 13531 1726882427.96843: done getting variables 13531 1726882427.97019: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:33:47 -0400 (0:00:01.247) 0:00:15.865 ****** 13531 1726882427.97056: entering _queue_task() for managed_node2/service 13531 1726882427.97422: worker is 1 (out of 1 available) 13531 1726882427.97434: exiting _queue_task() for managed_node2/service 13531 1726882427.97446: done queuing things up, now waiting for results queue to drain 13531 1726882427.97448: waiting for pending results... 
13531 1726882427.97747: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13531 1726882427.97906: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000033 13531 1726882427.97925: variable 'ansible_search_path' from source: unknown 13531 1726882427.97931: variable 'ansible_search_path' from source: unknown 13531 1726882427.97978: calling self._execute() 13531 1726882427.98070: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882427.98087: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882427.98106: variable 'omit' from source: magic vars 13531 1726882427.98489: variable 'ansible_distribution_major_version' from source: facts 13531 1726882427.98508: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882427.98642: variable 'network_provider' from source: set_fact 13531 1726882427.98653: Evaluated conditional (network_provider == "nm"): True 13531 1726882427.98759: variable '__network_wpa_supplicant_required' from source: role '' defaults 13531 1726882427.98849: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13531 1726882427.99036: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13531 1726882428.02239: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13531 1726882428.03010: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13531 1726882428.03199: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13531 1726882428.03203: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13531 1726882428.03205: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13531 1726882428.03673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882428.03707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882428.03735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882428.03789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882428.03808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882428.03862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882428.03892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882428.03919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882428.03967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882428.03988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882428.04031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882428.04057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882428.04093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882428.04134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882428.04151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882428.04440: variable 'network_connections' from source: task vars 13531 1726882428.04457: variable 'controller_profile' from source: play vars 13531 1726882428.04559: variable 'controller_profile' from source: play vars 13531 1726882428.04596: variable 'controller_device' from source: play vars 13531 1726882428.04672: variable 'controller_device' from source: play vars 13531 1726882428.04710: variable 'port1_profile' from source: play vars 13531 1726882428.04776: variable 'port1_profile' from source: play vars 13531 
1726882428.04788: variable 'dhcp_interface1' from source: play vars 13531 1726882428.04853: variable 'dhcp_interface1' from source: play vars 13531 1726882428.04866: variable 'controller_profile' from source: play vars 13531 1726882428.04922: variable 'controller_profile' from source: play vars 13531 1726882428.04934: variable 'port2_profile' from source: play vars 13531 1726882428.05002: variable 'port2_profile' from source: play vars 13531 1726882428.05015: variable 'dhcp_interface2' from source: play vars 13531 1726882428.05084: variable 'dhcp_interface2' from source: play vars 13531 1726882428.05095: variable 'controller_profile' from source: play vars 13531 1726882428.05157: variable 'controller_profile' from source: play vars 13531 1726882428.05236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13531 1726882428.05420: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13531 1726882428.05461: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13531 1726882428.05504: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13531 1726882428.05566: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13531 1726882428.05620: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13531 1726882428.05645: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13531 1726882428.05675: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 
(found_in_cache=True, class_only=False) 13531 1726882428.05721: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13531 1726882428.05781: variable '__network_wireless_connections_defined' from source: role '' defaults 13531 1726882428.06058: variable 'network_connections' from source: task vars 13531 1726882428.06070: variable 'controller_profile' from source: play vars 13531 1726882428.06130: variable 'controller_profile' from source: play vars 13531 1726882428.06149: variable 'controller_device' from source: play vars 13531 1726882428.06211: variable 'controller_device' from source: play vars 13531 1726882428.06223: variable 'port1_profile' from source: play vars 13531 1726882428.06293: variable 'port1_profile' from source: play vars 13531 1726882428.06304: variable 'dhcp_interface1' from source: play vars 13531 1726882428.06373: variable 'dhcp_interface1' from source: play vars 13531 1726882428.06384: variable 'controller_profile' from source: play vars 13531 1726882428.06444: variable 'controller_profile' from source: play vars 13531 1726882428.06455: variable 'port2_profile' from source: play vars 13531 1726882428.06523: variable 'port2_profile' from source: play vars 13531 1726882428.06534: variable 'dhcp_interface2' from source: play vars 13531 1726882428.06603: variable 'dhcp_interface2' from source: play vars 13531 1726882428.06615: variable 'controller_profile' from source: play vars 13531 1726882428.06678: variable 'controller_profile' from source: play vars 13531 1726882428.06729: Evaluated conditional (__network_wpa_supplicant_required): False 13531 1726882428.06737: when evaluation is False, skipping this task 13531 1726882428.06743: _execute() done 13531 1726882428.06749: dumping result to json 13531 1726882428.06754: done dumping result, returning 13531 1726882428.06767: done running TaskExecutor() for 
managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-4fd9-519d-000000000033] 13531 1726882428.06777: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000033 13531 1726882428.06886: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000033 13531 1726882428.06894: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 13531 1726882428.06945: no more pending results, returning what we have 13531 1726882428.06949: results queue empty 13531 1726882428.06950: checking for any_errors_fatal 13531 1726882428.06973: done checking for any_errors_fatal 13531 1726882428.06974: checking for max_fail_percentage 13531 1726882428.06977: done checking for max_fail_percentage 13531 1726882428.06978: checking to see if all hosts have failed and the running result is not ok 13531 1726882428.06979: done checking to see if all hosts have failed 13531 1726882428.06979: getting the remaining hosts for this loop 13531 1726882428.06981: done getting the remaining hosts for this loop 13531 1726882428.06985: getting the next task for host managed_node2 13531 1726882428.06992: done getting next task for host managed_node2 13531 1726882428.06997: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 13531 1726882428.07000: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 13531 1726882428.07015: getting variables 13531 1726882428.07017: in VariableManager get_vars() 13531 1726882428.07081: Calling all_inventory to load vars for managed_node2 13531 1726882428.07084: Calling groups_inventory to load vars for managed_node2 13531 1726882428.07087: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882428.07098: Calling all_plugins_play to load vars for managed_node2 13531 1726882428.07101: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882428.07105: Calling groups_plugins_play to load vars for managed_node2 13531 1726882428.08932: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882428.11658: done with get_vars() 13531 1726882428.11685: done getting variables 13531 1726882428.11750: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:33:48 -0400 (0:00:00.147) 0:00:16.013 ****** 13531 1726882428.11783: entering _queue_task() for managed_node2/service 13531 1726882428.12114: worker is 1 (out of 1 available) 13531 1726882428.12126: exiting _queue_task() for managed_node2/service 13531 1726882428.12143: done queuing things up, now waiting for results queue to drain 13531 1726882428.12145: waiting for pending results... 
13531 1726882428.12434: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 13531 1726882428.12579: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000034 13531 1726882428.12606: variable 'ansible_search_path' from source: unknown 13531 1726882428.12614: variable 'ansible_search_path' from source: unknown 13531 1726882428.12655: calling self._execute() 13531 1726882428.12754: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882428.12768: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882428.12782: variable 'omit' from source: magic vars 13531 1726882428.13171: variable 'ansible_distribution_major_version' from source: facts 13531 1726882428.13189: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882428.13382: variable 'network_provider' from source: set_fact 13531 1726882428.13409: Evaluated conditional (network_provider == "initscripts"): False 13531 1726882428.13420: when evaluation is False, skipping this task 13531 1726882428.13428: _execute() done 13531 1726882428.13434: dumping result to json 13531 1726882428.13441: done dumping result, returning 13531 1726882428.13476: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-4fd9-519d-000000000034] 13531 1726882428.13502: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000034 skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13531 1726882428.13753: no more pending results, returning what we have 13531 1726882428.13765: results queue empty 13531 1726882428.13766: checking for any_errors_fatal 13531 1726882428.13774: done checking for any_errors_fatal 13531 1726882428.13775: checking for max_fail_percentage 13531 1726882428.13777: done checking for max_fail_percentage 13531 
1726882428.13785: checking to see if all hosts have failed and the running result is not ok 13531 1726882428.13786: done checking to see if all hosts have failed 13531 1726882428.13787: getting the remaining hosts for this loop 13531 1726882428.13788: done getting the remaining hosts for this loop 13531 1726882428.13793: getting the next task for host managed_node2 13531 1726882428.13800: done getting next task for host managed_node2 13531 1726882428.13817: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13531 1726882428.13822: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882428.13846: getting variables 13531 1726882428.13849: in VariableManager get_vars() 13531 1726882428.13952: Calling all_inventory to load vars for managed_node2 13531 1726882428.13955: Calling groups_inventory to load vars for managed_node2 13531 1726882428.13967: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882428.13981: Calling all_plugins_play to load vars for managed_node2 13531 1726882428.13985: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882428.13988: Calling groups_plugins_play to load vars for managed_node2 13531 1726882428.15171: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000034 13531 1726882428.15175: WORKER PROCESS EXITING 13531 1726882428.16375: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882428.18416: done with get_vars() 13531 1726882428.18448: done getting variables 13531 1726882428.18509: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:33:48 -0400 (0:00:00.067) 0:00:16.080 ****** 13531 1726882428.18543: entering _queue_task() for managed_node2/copy 13531 1726882428.18855: worker is 1 (out of 1 available) 13531 1726882428.18868: exiting _queue_task() for managed_node2/copy 13531 1726882428.18880: done queuing things up, now waiting for results queue to drain 13531 1726882428.18881: waiting for pending results... 
13531 1726882428.19165: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13531 1726882428.19301: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000035 13531 1726882428.19323: variable 'ansible_search_path' from source: unknown 13531 1726882428.19331: variable 'ansible_search_path' from source: unknown 13531 1726882428.19373: calling self._execute() 13531 1726882428.19473: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882428.19483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882428.19496: variable 'omit' from source: magic vars 13531 1726882428.19855: variable 'ansible_distribution_major_version' from source: facts 13531 1726882428.19880: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882428.19997: variable 'network_provider' from source: set_fact 13531 1726882428.20008: Evaluated conditional (network_provider == "initscripts"): False 13531 1726882428.20015: when evaluation is False, skipping this task 13531 1726882428.20041: _execute() done 13531 1726882428.20061: dumping result to json 13531 1726882428.20084: done dumping result, returning 13531 1726882428.20111: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-4fd9-519d-000000000035] 13531 1726882428.20125: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000035 skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 13531 1726882428.20348: no more pending results, returning what we have 13531 1726882428.20361: results queue empty 13531 1726882428.20362: checking for any_errors_fatal 13531 1726882428.20369: done checking for any_errors_fatal 13531 1726882428.20370: checking for max_fail_percentage 13531 
1726882428.20372: done checking for max_fail_percentage 13531 1726882428.20373: checking to see if all hosts have failed and the running result is not ok 13531 1726882428.20374: done checking to see if all hosts have failed 13531 1726882428.20375: getting the remaining hosts for this loop 13531 1726882428.20376: done getting the remaining hosts for this loop 13531 1726882428.20380: getting the next task for host managed_node2 13531 1726882428.20394: done getting next task for host managed_node2 13531 1726882428.20399: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13531 1726882428.20410: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882428.20433: getting variables 13531 1726882428.20435: in VariableManager get_vars() 13531 1726882428.20519: Calling all_inventory to load vars for managed_node2 13531 1726882428.20522: Calling groups_inventory to load vars for managed_node2 13531 1726882428.20524: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882428.20536: Calling all_plugins_play to load vars for managed_node2 13531 1726882428.20538: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882428.20541: Calling groups_plugins_play to load vars for managed_node2 13531 1726882428.21551: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000035 13531 1726882428.21555: WORKER PROCESS EXITING 13531 1726882428.23019: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882428.24932: done with get_vars() 13531 1726882428.24971: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:33:48 -0400 (0:00:00.065) 0:00:16.146 ****** 13531 1726882428.25098: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 13531 1726882428.25101: Creating lock for fedora.linux_system_roles.network_connections 13531 1726882428.25484: worker is 1 (out of 1 available) 13531 1726882428.25495: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 13531 1726882428.25508: done queuing things up, now waiting for results queue to drain 13531 1726882428.25510: waiting for pending results... 
13531 1726882428.25825: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13531 1726882428.26002: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000036 13531 1726882428.26031: variable 'ansible_search_path' from source: unknown 13531 1726882428.26047: variable 'ansible_search_path' from source: unknown 13531 1726882428.26108: calling self._execute() 13531 1726882428.26219: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882428.26232: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882428.26245: variable 'omit' from source: magic vars 13531 1726882428.26712: variable 'ansible_distribution_major_version' from source: facts 13531 1726882428.26742: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882428.26762: variable 'omit' from source: magic vars 13531 1726882428.26845: variable 'omit' from source: magic vars 13531 1726882428.27070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13531 1726882428.29923: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13531 1726882428.30000: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13531 1726882428.30045: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13531 1726882428.30086: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13531 1726882428.30139: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13531 1726882428.30257: variable 'network_provider' from source: set_fact 13531 1726882428.30467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882428.30543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882428.30595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882428.30649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882428.30672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882428.30792: variable 'omit' from source: magic vars 13531 1726882428.30915: variable 'omit' from source: magic vars 13531 1726882428.31066: variable 'network_connections' from source: task vars 13531 1726882428.31085: variable 'controller_profile' from source: play vars 13531 1726882428.31160: variable 'controller_profile' from source: play vars 13531 1726882428.31176: variable 'controller_device' from source: play vars 13531 1726882428.31252: variable 'controller_device' from source: play vars 13531 1726882428.31276: variable 'port1_profile' from source: play vars 13531 1726882428.31356: variable 'port1_profile' from source: play vars 13531 1726882428.31370: variable 'dhcp_interface1' from source: play vars 13531 1726882428.31458: variable 'dhcp_interface1' from source: play vars 13531 1726882428.31474: variable 'controller_profile' from source: play vars 13531 1726882428.31534: variable 'controller_profile' from source: play vars 13531 1726882428.31551: 
variable 'port2_profile' from source: play vars 13531 1726882428.31624: variable 'port2_profile' from source: play vars 13531 1726882428.31636: variable 'dhcp_interface2' from source: play vars 13531 1726882428.31704: variable 'dhcp_interface2' from source: play vars 13531 1726882428.31716: variable 'controller_profile' from source: play vars 13531 1726882428.31782: variable 'controller_profile' from source: play vars 13531 1726882428.32023: variable 'omit' from source: magic vars 13531 1726882428.32037: variable '__lsr_ansible_managed' from source: task vars 13531 1726882428.32112: variable '__lsr_ansible_managed' from source: task vars 13531 1726882428.32332: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 13531 1726882428.32684: Loaded config def from plugin (lookup/template) 13531 1726882428.32694: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 13531 1726882428.32725: File lookup term: get_ansible_managed.j2 13531 1726882428.32732: variable 'ansible_search_path' from source: unknown 13531 1726882428.32746: evaluation_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 13531 1726882428.32766: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 13531 1726882428.32788: variable 'ansible_search_path' from source: unknown 13531 1726882428.40023: variable 'ansible_managed' from source: unknown 13531 1726882428.40179: variable 'omit' from source: magic vars 13531 1726882428.40244: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882428.40278: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882428.40306: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882428.40328: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882428.40343: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882428.40379: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882428.40388: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882428.40395: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882428.40532: Set connection var ansible_pipelining to False 13531 1726882428.40543: Set connection var ansible_timeout to 10 13531 1726882428.40552: Set connection var ansible_shell_executable to /bin/sh 13531 1726882428.40562: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882428.40596: Set connection var ansible_connection to ssh 13531 1726882428.40604: Set connection var ansible_shell_type to sh 13531 1726882428.40640: 
variable 'ansible_shell_executable' from source: unknown 13531 1726882428.40647: variable 'ansible_connection' from source: unknown 13531 1726882428.40654: variable 'ansible_module_compression' from source: unknown 13531 1726882428.40665: variable 'ansible_shell_type' from source: unknown 13531 1726882428.40673: variable 'ansible_shell_executable' from source: unknown 13531 1726882428.40680: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882428.40687: variable 'ansible_pipelining' from source: unknown 13531 1726882428.40693: variable 'ansible_timeout' from source: unknown 13531 1726882428.40701: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882428.40845: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13531 1726882428.40862: variable 'omit' from source: magic vars 13531 1726882428.40877: starting attempt loop 13531 1726882428.40884: running the handler 13531 1726882428.40903: _low_level_execute_command(): starting 13531 1726882428.40914: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13531 1726882428.41728: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882428.41752: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882428.41770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882428.41800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882428.41872: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882428.41885: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882428.41899: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882428.41918: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882428.41931: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882428.41946: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882428.41958: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882428.41975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882428.41991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882428.42003: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882428.42014: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882428.42029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882428.42145: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882428.42167: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882428.42184: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882428.42328: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882428.43984: stdout chunk (state=3): >>>/root <<< 13531 1726882428.44088: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882428.44169: stderr chunk (state=3): >>><<< 13531 1726882428.44172: stdout chunk (state=3): >>><<< 13531 1726882428.44282: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882428.44285: _low_level_execute_command(): starting 13531 1726882428.44289: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882428.4419184-14267-236024954364651 `" && echo ansible-tmp-1726882428.4419184-14267-236024954364651="` echo /root/.ansible/tmp/ansible-tmp-1726882428.4419184-14267-236024954364651 `" ) && sleep 0' 13531 1726882428.44874: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882428.44889: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882428.44906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882428.44953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882428.45084: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 
originally 10.31.11.158 <<< 13531 1726882428.45097: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882428.45111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882428.45129: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882428.45141: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882428.45152: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882428.45168: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882428.45185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882428.45200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882428.45212: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882428.45223: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882428.45236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882428.45315: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882428.45331: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882428.45344: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882428.45511: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882428.47393: stdout chunk (state=3): >>>ansible-tmp-1726882428.4419184-14267-236024954364651=/root/.ansible/tmp/ansible-tmp-1726882428.4419184-14267-236024954364651 <<< 13531 1726882428.47592: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882428.47600: stdout chunk (state=3): >>><<< 13531 
1726882428.47603: stderr chunk (state=3): >>><<< 13531 1726882428.48072: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882428.4419184-14267-236024954364651=/root/.ansible/tmp/ansible-tmp-1726882428.4419184-14267-236024954364651 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882428.48076: variable 'ansible_module_compression' from source: unknown 13531 1726882428.48078: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 13531 1726882428.48080: ANSIBALLZ: Acquiring lock 13531 1726882428.48082: ANSIBALLZ: Lock acquired: 139969310553104 13531 1726882428.48084: ANSIBALLZ: Creating module 13531 1726882428.74355: ANSIBALLZ: Writing module into payload 13531 1726882428.74703: ANSIBALLZ: Writing module 13531 1726882428.74726: ANSIBALLZ: Renaming module 13531 1726882428.74730: ANSIBALLZ: Done creating module 13531 1726882428.74753: variable 
'ansible_facts' from source: unknown 13531 1726882428.74820: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882428.4419184-14267-236024954364651/AnsiballZ_network_connections.py 13531 1726882428.74932: Sending initial data 13531 1726882428.74935: Sent initial data (168 bytes) 13531 1726882428.75918: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882428.75922: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882428.75934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882428.75947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882428.75994: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882428.76026: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882428.76029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882428.76032: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882428.76034: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882428.76037: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882428.76043: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882428.76052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882428.76068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882428.76214: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882428.76217: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882428.76223: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882428.76225: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882428.76227: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882428.76229: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882428.76382: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882428.78194: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13531 1726882428.78291: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13531 1726882428.78390: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13531i_snf1g8/tmphcl4piee /root/.ansible/tmp/ansible-tmp-1726882428.4419184-14267-236024954364651/AnsiballZ_network_connections.py <<< 13531 1726882428.78487: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13531 1726882428.80871: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882428.80960: stderr chunk (state=3): >>><<< 13531 1726882428.80966: stdout chunk (state=3): >>><<< 13531 1726882428.80988: done transferring module to remote 13531 1726882428.80999: _low_level_execute_command(): starting 13531 1726882428.81002: 
_low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882428.4419184-14267-236024954364651/ /root/.ansible/tmp/ansible-tmp-1726882428.4419184-14267-236024954364651/AnsiballZ_network_connections.py && sleep 0' 13531 1726882428.81765: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882428.81781: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882428.81796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882428.81820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882428.81866: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882428.81880: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882428.81895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882428.81913: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882428.81930: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882428.81945: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882428.81958: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882428.81974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882428.81990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882428.82001: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882428.82010: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882428.82022: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882428.82109: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882428.82131: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882428.82150: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882428.82294: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882428.84203: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882428.84208: stdout chunk (state=3): >>><<< 13531 1726882428.84210: stderr chunk (state=3): >>><<< 13531 1726882428.84269: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882428.84273: _low_level_execute_command(): starting 13531 1726882428.84276: _low_level_execute_command(): 
executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882428.4419184-14267-236024954364651/AnsiballZ_network_connections.py && sleep 0' 13531 1726882428.84967: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882428.84987: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882428.85004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882428.85024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882428.85069: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882428.85082: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882428.85107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882428.85125: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882428.85137: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882428.85151: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882428.85163: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882428.85180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882428.85196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882428.85209: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882428.85230: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882428.85243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882428.85330: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master <<< 13531 1726882428.85352: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882428.85370: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882428.85509: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882429.26041: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, ed834b6c-cbda-46ef-ae08-dad7f6819810\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 207956e5-5781-4b3b-a739-18944f85bf50\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 5dda56b8-72f2-4584-944c-6391c4eeec78\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, ed834b6c-cbda-46ef-ae08-dad7f6819810 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 207956e5-5781-4b3b-a739-18944f85bf50 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 5dda56b8-72f2-4584-944c-6391c4eeec78 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, 
{"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 13531 1726882429.28627: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 13531 1726882429.28686: stderr chunk (state=3): >>><<< 13531 1726882429.28690: stdout chunk (state=3): >>><<< 13531 1726882429.28741: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, ed834b6c-cbda-46ef-ae08-dad7f6819810\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 207956e5-5781-4b3b-a739-18944f85bf50\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 5dda56b8-72f2-4584-944c-6391c4eeec78\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, ed834b6c-cbda-46ef-ae08-dad7f6819810 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 207956e5-5781-4b3b-a739-18944f85bf50 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 5dda56b8-72f2-4584-944c-6391c4eeec78 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, 
"force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
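The `_low_level_execute_command()` calls above follow Ansible's usual remote execution lifecycle: create a unique temp directory under a restrictive umask, upload the AnsiballZ payload over sftp, mark it executable, run it with the remote Python, then remove the directory. A minimal sketch of that sequence, with an illustrative path and a stand-in payload rather than the exact ones from this run:

```shell
# Sketch of the remote-side lifecycle seen in this log (paths are illustrative).
umask 77
tmp="$HOME/.ansible/tmp/ansible-tmp-example-$$"
mkdir -p "$HOME/.ansible/tmp" && mkdir "$tmp"            # unique per-task temp dir
printf 'print("module ran")\n' > "$tmp/AnsiballZ_network_connections.py"  # stand-in for the real AnsiballZ zip payload
chmod u+x "$tmp" "$tmp/AnsiballZ_network_connections.py" # matches the logged chmod step
python3 "$tmp/AnsiballZ_network_connections.py"          # module executes, emits JSON on stdout in the real run
rm -rf "$tmp"                                            # cleanup, as in the final rm -f -r step
```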
13531 1726882429.28776: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'up', 'type': 'bond', 'interface_name': 'nm-bond', 'bond': {'mode': 'active-backup', 'miimon': 110}, 'ip': {'route_metric4': 65535}}, {'name': 'bond0.0', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test1', 'controller': 'bond0'}, {'name': 'bond0.1', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test2', 'controller': 'bond0'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882428.4419184-14267-236024954364651/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13531 1726882429.28791: _low_level_execute_command(): starting 13531 1726882429.28794: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882428.4419184-14267-236024954364651/ > /dev/null 2>&1 && sleep 0' 13531 1726882429.29376: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882429.29380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882429.29409: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882429.29412: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882429.29415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882429.29460: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882429.29470: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882429.29588: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882429.31448: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882429.31497: stderr chunk (state=3): >>><<< 13531 1726882429.31500: stdout chunk (state=3): >>><<< 13531 1726882429.31513: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882429.31519: handler run complete 13531 1726882429.31551: attempt loop complete, returning result 13531 1726882429.31555: _execute() done 13531 1726882429.31557: dumping result to json 13531 1726882429.31565: done dumping result, returning 13531 1726882429.31574: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-4fd9-519d-000000000036] 13531 1726882429.31579: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000036 13531 1726882429.31688: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000036 13531 1726882429.31691: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "miimon": 110, "mode": "active-backup" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [007] #0, state:up persistent_state:present, 'bond0': add connection bond0, ed834b6c-cbda-46ef-ae08-dad7f6819810 [008] #1, state:up persistent_state:present, 'bond0.0': add 
connection bond0.0, 207956e5-5781-4b3b-a739-18944f85bf50 [009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 5dda56b8-72f2-4584-944c-6391c4eeec78 [010] #0, state:up persistent_state:present, 'bond0': up connection bond0, ed834b6c-cbda-46ef-ae08-dad7f6819810 (is-modified) [011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 207956e5-5781-4b3b-a739-18944f85bf50 (not-active) [012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 5dda56b8-72f2-4584-944c-6391c4eeec78 (not-active) 13531 1726882429.31822: no more pending results, returning what we have 13531 1726882429.31828: results queue empty 13531 1726882429.31829: checking for any_errors_fatal 13531 1726882429.31837: done checking for any_errors_fatal 13531 1726882429.31838: checking for max_fail_percentage 13531 1726882429.31840: done checking for max_fail_percentage 13531 1726882429.31841: checking to see if all hosts have failed and the running result is not ok 13531 1726882429.31842: done checking to see if all hosts have failed 13531 1726882429.31845: getting the remaining hosts for this loop 13531 1726882429.31847: done getting the remaining hosts for this loop 13531 1726882429.31851: getting the next task for host managed_node2 13531 1726882429.31859: done getting next task for host managed_node2 13531 1726882429.31867: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 13531 1726882429.31876: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 13531 1726882429.31890: getting variables 13531 1726882429.31894: in VariableManager get_vars() 13531 1726882429.31961: Calling all_inventory to load vars for managed_node2 13531 1726882429.31967: Calling groups_inventory to load vars for managed_node2 13531 1726882429.31970: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882429.31980: Calling all_plugins_play to load vars for managed_node2 13531 1726882429.31982: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882429.31985: Calling groups_plugins_play to load vars for managed_node2 13531 1726882429.33348: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882429.34973: done with get_vars() 13531 1726882429.34998: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:33:49 -0400 (0:00:01.099) 0:00:17.246 ****** 13531 1726882429.35089: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 13531 1726882429.35091: Creating lock for fedora.linux_system_roles.network_state 13531 1726882429.35406: worker is 1 (out of 1 available) 13531 1726882429.35420: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 13531 1726882429.35433: done queuing things up, now waiting for results queue to drain 13531 1726882429.35434: waiting for pending results... 
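The "Configure networking connection profiles" invocation logged above corresponds to a `network_connections` role-variable block along the following lines. This is a hedged reconstruction from the logged `module_args` only; the actual playbook source is not shown in this log and may differ in layout.

```yaml
# Reconstructed from the logged module_args for
# fedora.linux_system_roles.network_connections (provider: nm).
network_connections:
  - name: bond0
    state: up
    type: bond
    interface_name: nm-bond
    bond:
      mode: active-backup
      miimon: 110
    ip:
      route_metric4: 65535
  - name: bond0.0
    state: up
    type: ethernet
    interface_name: test1
    controller: bond0
  - name: bond0.1
    state: up
    type: ethernet
    interface_name: test2
    controller: bond0
```

Note how each port profile (`bond0.0`, `bond0.1`) attaches to the bond via `controller: bond0`, matching the `[008]`/`[009]` "add connection" lines in the module's stderr.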
13531 1726882429.35720: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 13531 1726882429.35852: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000037 13531 1726882429.35878: variable 'ansible_search_path' from source: unknown 13531 1726882429.35887: variable 'ansible_search_path' from source: unknown 13531 1726882429.35927: calling self._execute() 13531 1726882429.36025: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882429.36028: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882429.36038: variable 'omit' from source: magic vars 13531 1726882429.36326: variable 'ansible_distribution_major_version' from source: facts 13531 1726882429.36337: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882429.36420: variable 'network_state' from source: role '' defaults 13531 1726882429.36429: Evaluated conditional (network_state != {}): False 13531 1726882429.36432: when evaluation is False, skipping this task 13531 1726882429.36434: _execute() done 13531 1726882429.36437: dumping result to json 13531 1726882429.36439: done dumping result, returning 13531 1726882429.36446: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-4fd9-519d-000000000037] 13531 1726882429.36456: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000037 13531 1726882429.36537: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000037 13531 1726882429.36540: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13531 1726882429.36591: no more pending results, returning what we have 13531 1726882429.36595: results queue empty 13531 1726882429.36596: checking for any_errors_fatal 13531 1726882429.36609: done checking for any_errors_fatal 
13531 1726882429.36609: checking for max_fail_percentage 13531 1726882429.36611: done checking for max_fail_percentage 13531 1726882429.36612: checking to see if all hosts have failed and the running result is not ok 13531 1726882429.36612: done checking to see if all hosts have failed 13531 1726882429.36613: getting the remaining hosts for this loop 13531 1726882429.36614: done getting the remaining hosts for this loop 13531 1726882429.36617: getting the next task for host managed_node2 13531 1726882429.36622: done getting next task for host managed_node2 13531 1726882429.36626: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13531 1726882429.36629: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882429.36642: getting variables 13531 1726882429.36644: in VariableManager get_vars() 13531 1726882429.36693: Calling all_inventory to load vars for managed_node2 13531 1726882429.36696: Calling groups_inventory to load vars for managed_node2 13531 1726882429.36698: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882429.36707: Calling all_plugins_play to load vars for managed_node2 13531 1726882429.36709: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882429.36712: Calling groups_plugins_play to load vars for managed_node2 13531 1726882429.37590: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882429.39227: done with get_vars() 13531 1726882429.39257: done getting variables 13531 1726882429.39317: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:33:49 -0400 (0:00:00.042) 0:00:17.288 ****** 13531 1726882429.39350: entering _queue_task() for managed_node2/debug 13531 1726882429.39671: worker is 1 (out of 1 available) 13531 1726882429.39683: exiting _queue_task() for managed_node2/debug 13531 1726882429.39696: done queuing things up, now waiting for results queue to drain 13531 1726882429.39697: waiting for pending results... 
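The "Show stderr messages" task that runs next prints `__network_connections_result.stderr_lines`, as seen in its `ok:` result below. A minimal sketch of such a task, assuming the role uses the builtin `debug` action the log shows being loaded:

```yaml
# Sketch of the debug task at roles/network/tasks/main.yml:177;
# the variable name is taken from the logged output.
- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result.stderr_lines
```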
13531 1726882429.40011: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13531 1726882429.40217: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000038 13531 1726882429.40240: variable 'ansible_search_path' from source: unknown 13531 1726882429.40248: variable 'ansible_search_path' from source: unknown 13531 1726882429.40318: calling self._execute() 13531 1726882429.40424: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882429.40436: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882429.40449: variable 'omit' from source: magic vars 13531 1726882429.40856: variable 'ansible_distribution_major_version' from source: facts 13531 1726882429.40878: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882429.40891: variable 'omit' from source: magic vars 13531 1726882429.40956: variable 'omit' from source: magic vars 13531 1726882429.41000: variable 'omit' from source: magic vars 13531 1726882429.41052: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882429.41095: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882429.41120: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882429.41145: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882429.41166: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882429.41200: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882429.41210: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882429.41218: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 13531 1726882429.41334: Set connection var ansible_pipelining to False 13531 1726882429.41345: Set connection var ansible_timeout to 10 13531 1726882429.41363: Set connection var ansible_shell_executable to /bin/sh 13531 1726882429.41376: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882429.41383: Set connection var ansible_connection to ssh 13531 1726882429.41390: Set connection var ansible_shell_type to sh 13531 1726882429.41420: variable 'ansible_shell_executable' from source: unknown 13531 1726882429.41429: variable 'ansible_connection' from source: unknown 13531 1726882429.41436: variable 'ansible_module_compression' from source: unknown 13531 1726882429.41444: variable 'ansible_shell_type' from source: unknown 13531 1726882429.41450: variable 'ansible_shell_executable' from source: unknown 13531 1726882429.41460: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882429.41474: variable 'ansible_pipelining' from source: unknown 13531 1726882429.41482: variable 'ansible_timeout' from source: unknown 13531 1726882429.41489: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882429.41636: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882429.41657: variable 'omit' from source: magic vars 13531 1726882429.41670: starting attempt loop 13531 1726882429.41677: running the handler 13531 1726882429.41818: variable '__network_connections_result' from source: set_fact 13531 1726882429.41891: handler run complete 13531 1726882429.41919: attempt loop complete, returning result 13531 1726882429.41927: _execute() done 13531 1726882429.41933: dumping result to json 13531 1726882429.41941: 
done dumping result, returning 13531 1726882429.41957: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-4fd9-519d-000000000038] 13531 1726882429.41971: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000038 ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, ed834b6c-cbda-46ef-ae08-dad7f6819810", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 207956e5-5781-4b3b-a739-18944f85bf50", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 5dda56b8-72f2-4584-944c-6391c4eeec78", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, ed834b6c-cbda-46ef-ae08-dad7f6819810 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 207956e5-5781-4b3b-a739-18944f85bf50 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 5dda56b8-72f2-4584-944c-6391c4eeec78 (not-active)" ] } 13531 1726882429.42145: no more pending results, returning what we have 13531 1726882429.42148: results queue empty 13531 1726882429.42149: checking for any_errors_fatal 13531 1726882429.42158: done checking for any_errors_fatal 13531 1726882429.42159: checking for max_fail_percentage 13531 1726882429.42161: done checking for max_fail_percentage 13531 1726882429.42162: checking to see if all hosts have failed and the running result is not ok 13531 1726882429.42162: done checking to see if all hosts have failed 13531 1726882429.42165: getting the remaining hosts for this loop 13531 1726882429.42167: done getting the remaining hosts for this loop 13531 1726882429.42170: getting the next task for host managed_node2 13531 1726882429.42177: done getting next task for host managed_node2 13531 1726882429.42183: ^ task is: TASK: 
fedora.linux_system_roles.network : Show debug messages for the network_connections 13531 1726882429.42186: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882429.42198: getting variables 13531 1726882429.42200: in VariableManager get_vars() 13531 1726882429.42263: Calling all_inventory to load vars for managed_node2 13531 1726882429.42267: Calling groups_inventory to load vars for managed_node2 13531 1726882429.42270: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882429.42282: Calling all_plugins_play to load vars for managed_node2 13531 1726882429.42285: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882429.42288: Calling groups_plugins_play to load vars for managed_node2 13531 1726882429.43284: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000038 13531 1726882429.43288: WORKER PROCESS EXITING 13531 1726882429.43860: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882429.44907: done with get_vars() 13531 1726882429.44925: done getting variables 13531 1726882429.44974: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:33:49 -0400 (0:00:00.056) 0:00:17.345 ****** 13531 1726882429.45005: entering _queue_task() for managed_node2/debug 13531 1726882429.45250: worker is 1 (out of 1 available) 13531 1726882429.45267: exiting _queue_task() for managed_node2/debug 13531 1726882429.45284: done queuing things up, now waiting for results queue to drain 13531 1726882429.45286: waiting for pending results... 13531 1726882429.45530: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 13531 1726882429.45681: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000039 13531 1726882429.45705: variable 'ansible_search_path' from source: unknown 13531 1726882429.45713: variable 'ansible_search_path' from source: unknown 13531 1726882429.45765: calling self._execute() 13531 1726882429.45858: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882429.45876: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882429.45889: variable 'omit' from source: magic vars 13531 1726882429.46262: variable 'ansible_distribution_major_version' from source: facts 13531 1726882429.46283: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882429.46295: variable 'omit' from source: magic vars 13531 1726882429.46357: variable 'omit' from source: magic vars 13531 1726882429.46399: variable 'omit' from source: magic vars 13531 1726882429.46450: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882429.46491: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882429.46517: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882429.46541: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882429.46561: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882429.46598: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882429.46609: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882429.46612: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882429.46712: Set connection var ansible_pipelining to False 13531 1726882429.46715: Set connection var ansible_timeout to 10 13531 1726882429.46718: Set connection var ansible_shell_executable to /bin/sh 13531 1726882429.46723: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882429.46726: Set connection var ansible_connection to ssh 13531 1726882429.46728: Set connection var ansible_shell_type to sh 13531 1726882429.46748: variable 'ansible_shell_executable' from source: unknown 13531 1726882429.46755: variable 'ansible_connection' from source: unknown 13531 1726882429.46758: variable 'ansible_module_compression' from source: unknown 13531 1726882429.46760: variable 'ansible_shell_type' from source: unknown 13531 1726882429.46762: variable 'ansible_shell_executable' from source: unknown 13531 1726882429.46766: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882429.46771: variable 'ansible_pipelining' from source: unknown 13531 1726882429.46774: variable 'ansible_timeout' from source: unknown 13531 1726882429.46778: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882429.46891: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882429.46900: variable 'omit' from source: magic vars 13531 1726882429.46905: starting attempt loop 13531 1726882429.46908: running the handler 13531 1726882429.46946: variable '__network_connections_result' from source: set_fact 13531 1726882429.47008: variable '__network_connections_result' from source: set_fact 13531 1726882429.47123: handler run complete 13531 1726882429.47143: attempt loop complete, returning result 13531 1726882429.47146: _execute() done 13531 1726882429.47148: dumping result to json 13531 1726882429.47153: done dumping result, returning 13531 1726882429.47165: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-4fd9-519d-000000000039] 13531 1726882429.47170: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000039 13531 1726882429.47269: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000039 13531 1726882429.47272: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "miimon": 110, "mode": "active-backup" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 
ed834b6c-cbda-46ef-ae08-dad7f6819810\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 207956e5-5781-4b3b-a739-18944f85bf50\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 5dda56b8-72f2-4584-944c-6391c4eeec78\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, ed834b6c-cbda-46ef-ae08-dad7f6819810 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 207956e5-5781-4b3b-a739-18944f85bf50 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 5dda56b8-72f2-4584-944c-6391c4eeec78 (not-active)\n", "stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, ed834b6c-cbda-46ef-ae08-dad7f6819810", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 207956e5-5781-4b3b-a739-18944f85bf50", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 5dda56b8-72f2-4584-944c-6391c4eeec78", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, ed834b6c-cbda-46ef-ae08-dad7f6819810 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 207956e5-5781-4b3b-a739-18944f85bf50 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 5dda56b8-72f2-4584-944c-6391c4eeec78 (not-active)" ] } } 13531 1726882429.47379: no more pending results, returning what we have 13531 1726882429.47383: results queue empty 13531 1726882429.47389: checking for any_errors_fatal 13531 1726882429.47393: done checking for any_errors_fatal 13531 1726882429.47394: checking for max_fail_percentage 13531 1726882429.47396: done checking for max_fail_percentage 13531 1726882429.47396: checking to see if all hosts have failed and the running result is not ok 13531 1726882429.47397: done checking to see if all hosts have failed 13531 1726882429.47398: getting the remaining 
hosts for this loop 13531 1726882429.47399: done getting the remaining hosts for this loop 13531 1726882429.47402: getting the next task for host managed_node2 13531 1726882429.47407: done getting next task for host managed_node2 13531 1726882429.47410: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 13531 1726882429.47413: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882429.47422: getting variables 13531 1726882429.47424: in VariableManager get_vars() 13531 1726882429.47470: Calling all_inventory to load vars for managed_node2 13531 1726882429.47473: Calling groups_inventory to load vars for managed_node2 13531 1726882429.47475: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882429.47484: Calling all_plugins_play to load vars for managed_node2 13531 1726882429.47486: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882429.47488: Calling groups_plugins_play to load vars for managed_node2 13531 1726882429.48286: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882429.49704: done with get_vars() 13531 1726882429.49731: done getting variables 13531 1726882429.49793: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:33:49 -0400 (0:00:00.048) 0:00:17.393 ****** 13531 1726882429.49828: entering _queue_task() for managed_node2/debug 13531 1726882429.50157: worker is 1 (out of 1 available) 13531 1726882429.50171: exiting _queue_task() for managed_node2/debug 13531 1726882429.50185: done queuing things up, now waiting for results queue to drain 13531 1726882429.50186: waiting for pending results... 13531 1726882429.50483: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 13531 1726882429.50630: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000003a 13531 1726882429.50653: variable 'ansible_search_path' from source: unknown 13531 1726882429.50663: variable 'ansible_search_path' from source: unknown 13531 1726882429.50717: calling self._execute() 13531 1726882429.50813: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882429.50826: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882429.50842: variable 'omit' from source: magic vars 13531 1726882429.51203: variable 'ansible_distribution_major_version' from source: facts 13531 1726882429.51220: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882429.51343: variable 'network_state' from source: role '' defaults 13531 1726882429.51358: Evaluated conditional (network_state != {}): False 13531 1726882429.51368: when evaluation is False, skipping this task 13531 1726882429.51376: _execute() done 13531 1726882429.51383: dumping result to json 13531 1726882429.51392: done 
dumping result, returning 13531 1726882429.51402: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-4fd9-519d-00000000003a] 13531 1726882429.51412: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000003a skipping: [managed_node2] => { "false_condition": "network_state != {}" } 13531 1726882429.51555: no more pending results, returning what we have 13531 1726882429.51560: results queue empty 13531 1726882429.51561: checking for any_errors_fatal 13531 1726882429.51574: done checking for any_errors_fatal 13531 1726882429.51575: checking for max_fail_percentage 13531 1726882429.51577: done checking for max_fail_percentage 13531 1726882429.51578: checking to see if all hosts have failed and the running result is not ok 13531 1726882429.51579: done checking to see if all hosts have failed 13531 1726882429.51580: getting the remaining hosts for this loop 13531 1726882429.51581: done getting the remaining hosts for this loop 13531 1726882429.51585: getting the next task for host managed_node2 13531 1726882429.51592: done getting next task for host managed_node2 13531 1726882429.51596: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 13531 1726882429.51600: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882429.51616: getting variables 13531 1726882429.51619: in VariableManager get_vars() 13531 1726882429.51679: Calling all_inventory to load vars for managed_node2 13531 1726882429.51682: Calling groups_inventory to load vars for managed_node2 13531 1726882429.51685: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882429.51699: Calling all_plugins_play to load vars for managed_node2 13531 1726882429.51702: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882429.51705: Calling groups_plugins_play to load vars for managed_node2 13531 1726882429.52683: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000003a 13531 1726882429.52686: WORKER PROCESS EXITING 13531 1726882429.53528: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882429.55207: done with get_vars() 13531 1726882429.55234: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:33:49 -0400 (0:00:00.055) 0:00:17.448 ****** 13531 1726882429.55338: entering _queue_task() for managed_node2/ping 13531 1726882429.55340: Creating lock for ping 13531 1726882429.55681: worker is 1 (out of 1 available) 13531 1726882429.55693: exiting _queue_task() for managed_node2/ping 13531 1726882429.55704: done queuing things up, now waiting for results queue to drain 13531 1726882429.55706: waiting for pending results... 
13531 1726882429.55988: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 13531 1726882429.56122: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000003b 13531 1726882429.56142: variable 'ansible_search_path' from source: unknown 13531 1726882429.56150: variable 'ansible_search_path' from source: unknown 13531 1726882429.56192: calling self._execute() 13531 1726882429.56285: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882429.56295: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882429.56308: variable 'omit' from source: magic vars 13531 1726882429.56665: variable 'ansible_distribution_major_version' from source: facts 13531 1726882429.56684: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882429.56697: variable 'omit' from source: magic vars 13531 1726882429.56754: variable 'omit' from source: magic vars 13531 1726882429.56794: variable 'omit' from source: magic vars 13531 1726882429.56840: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882429.56880: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882429.56906: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882429.56930: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882429.56945: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882429.56981: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882429.56989: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882429.56995: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 13531 1726882429.57106: Set connection var ansible_pipelining to False 13531 1726882429.57117: Set connection var ansible_timeout to 10 13531 1726882429.57130: Set connection var ansible_shell_executable to /bin/sh 13531 1726882429.57139: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882429.57146: Set connection var ansible_connection to ssh 13531 1726882429.57151: Set connection var ansible_shell_type to sh 13531 1726882429.57183: variable 'ansible_shell_executable' from source: unknown 13531 1726882429.57191: variable 'ansible_connection' from source: unknown 13531 1726882429.57197: variable 'ansible_module_compression' from source: unknown 13531 1726882429.57203: variable 'ansible_shell_type' from source: unknown 13531 1726882429.57208: variable 'ansible_shell_executable' from source: unknown 13531 1726882429.57214: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882429.57220: variable 'ansible_pipelining' from source: unknown 13531 1726882429.57225: variable 'ansible_timeout' from source: unknown 13531 1726882429.57233: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882429.57437: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13531 1726882429.57458: variable 'omit' from source: magic vars 13531 1726882429.57471: starting attempt loop 13531 1726882429.57478: running the handler 13531 1726882429.57496: _low_level_execute_command(): starting 13531 1726882429.57507: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13531 1726882429.58279: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882429.58295: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 
1726882429.58311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882429.58335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882429.58383: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882429.58395: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882429.58410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882429.58433: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882429.58447: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882429.58459: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882429.58475: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882429.58490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882429.58507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882429.58521: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882429.58533: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882429.58552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882429.58629: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882429.58656: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882429.58677: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882429.58817: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 
1726882429.60496: stdout chunk (state=3): >>>/root <<< 13531 1726882429.60699: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882429.60702: stdout chunk (state=3): >>><<< 13531 1726882429.60704: stderr chunk (state=3): >>><<< 13531 1726882429.60820: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882429.60823: _low_level_execute_command(): starting 13531 1726882429.60826: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882429.6072478-14309-22259168007276 `" && echo ansible-tmp-1726882429.6072478-14309-22259168007276="` echo /root/.ansible/tmp/ansible-tmp-1726882429.6072478-14309-22259168007276 `" ) && sleep 0' 13531 1726882429.61419: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 13531 1726882429.61432: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882429.61445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882429.61470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882429.61515: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882429.61526: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882429.61539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882429.61555: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882429.61568: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882429.61584: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882429.61595: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882429.61608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882429.61622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882429.61632: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882429.61641: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882429.61653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882429.61733: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882429.61754: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882429.61773: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 13531 1726882429.61911: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882429.63787: stdout chunk (state=3): >>>ansible-tmp-1726882429.6072478-14309-22259168007276=/root/.ansible/tmp/ansible-tmp-1726882429.6072478-14309-22259168007276 <<< 13531 1726882429.63951: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882429.63954: stdout chunk (state=3): >>><<< 13531 1726882429.63956: stderr chunk (state=3): >>><<< 13531 1726882429.63975: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882429.6072478-14309-22259168007276=/root/.ansible/tmp/ansible-tmp-1726882429.6072478-14309-22259168007276 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882429.64012: variable 'ansible_module_compression' from source: unknown 13531 1726882429.64044: ANSIBALLZ: Using lock for ping 13531 
1726882429.64047: ANSIBALLZ: Acquiring lock 13531 1726882429.64050: ANSIBALLZ: Lock acquired: 139969310547296 13531 1726882429.64055: ANSIBALLZ: Creating module 13531 1726882429.77033: ANSIBALLZ: Writing module into payload 13531 1726882429.77087: ANSIBALLZ: Writing module 13531 1726882429.77104: ANSIBALLZ: Renaming module 13531 1726882429.77110: ANSIBALLZ: Done creating module 13531 1726882429.77144: variable 'ansible_facts' from source: unknown 13531 1726882429.77194: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882429.6072478-14309-22259168007276/AnsiballZ_ping.py 13531 1726882429.77319: Sending initial data 13531 1726882429.77322: Sent initial data (152 bytes) 13531 1726882429.78070: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882429.78076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882429.78106: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882429.78119: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882429.78177: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882429.78183: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK <<< 13531 1726882429.78194: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882429.78311: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882429.80221: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13531 1726882429.80320: stderr chunk (state=3): >>>debug1: Using server download size 261120 <<< 13531 1726882429.80328: stderr chunk (state=3): >>>debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13531 1726882429.80428: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13531i_snf1g8/tmpz3qwgijg /root/.ansible/tmp/ansible-tmp-1726882429.6072478-14309-22259168007276/AnsiballZ_ping.py <<< 13531 1726882429.80535: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13531 1726882429.81786: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882429.81945: stderr chunk (state=3): >>><<< 13531 1726882429.81976: stdout chunk (state=3): >>><<< 13531 1726882429.82026: done transferring module to remote 13531 1726882429.82042: _low_level_execute_command(): starting 13531 1726882429.82045: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882429.6072478-14309-22259168007276/ 
/root/.ansible/tmp/ansible-tmp-1726882429.6072478-14309-22259168007276/AnsiballZ_ping.py && sleep 0' 13531 1726882429.82533: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882429.82539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882429.82576: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882429.82587: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882429.82640: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882429.82652: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882429.82767: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882429.84533: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882429.84611: stderr chunk (state=3): >>><<< 13531 1726882429.84621: stdout chunk (state=3): >>><<< 13531 1726882429.84654: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882429.84659: _low_level_execute_command(): starting 13531 1726882429.84666: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882429.6072478-14309-22259168007276/AnsiballZ_ping.py && sleep 0' 13531 1726882429.85214: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882429.85220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882429.85260: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 13531 1726882429.85299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882429.85351: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882429.85365: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882429.85377: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882429.85525: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882429.98474: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 13531 1726882429.99576: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 13531 1726882429.99580: stdout chunk (state=3): >>><<< 13531 1726882429.99583: stderr chunk (state=3): >>><<< 13531 1726882429.99711: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
13531 1726882429.99715: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882429.6072478-14309-22259168007276/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13531 1726882429.99718: _low_level_execute_command(): starting 13531 1726882429.99720: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882429.6072478-14309-22259168007276/ > /dev/null 2>&1 && sleep 0' 13531 1726882430.00313: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882430.00328: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882430.00344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882430.00369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882430.00412: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882430.00424: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882430.00438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882430.00472: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882430.00485: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 
1726882430.00496: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882430.00509: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882430.00522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882430.00538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882430.00550: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882430.00568: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882430.00583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882430.00661: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882430.00681: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882430.00696: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882430.00829: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882430.02652: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882430.02730: stderr chunk (state=3): >>><<< 13531 1726882430.02734: stdout chunk (state=3): >>><<< 13531 1726882430.02974: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882430.02978: handler run complete 13531 1726882430.02980: attempt loop complete, returning result 13531 1726882430.02982: _execute() done 13531 1726882430.02984: dumping result to json 13531 1726882430.02986: done dumping result, returning 13531 1726882430.02988: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-4fd9-519d-00000000003b] 13531 1726882430.02990: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000003b 13531 1726882430.03068: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000003b 13531 1726882430.03071: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 13531 1726882430.03138: no more pending results, returning what we have 13531 1726882430.03142: results queue empty 13531 1726882430.03144: checking for any_errors_fatal 13531 1726882430.03150: done checking for any_errors_fatal 13531 1726882430.03151: checking for max_fail_percentage 13531 1726882430.03155: done checking for max_fail_percentage 13531 1726882430.03156: checking to see if all hosts have failed and the running result is not ok 13531 1726882430.03157: done checking to see if all hosts have failed 13531 1726882430.03158: getting the remaining hosts for this loop 13531 1726882430.03159: done getting the remaining hosts for this loop 13531 
1726882430.03185: getting the next task for host managed_node2 13531 1726882430.03197: done getting next task for host managed_node2 13531 1726882430.03200: ^ task is: TASK: meta (role_complete) 13531 1726882430.03204: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882430.03215: getting variables 13531 1726882430.03217: in VariableManager get_vars() 13531 1726882430.03306: Calling all_inventory to load vars for managed_node2 13531 1726882430.03309: Calling groups_inventory to load vars for managed_node2 13531 1726882430.03312: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882430.03325: Calling all_plugins_play to load vars for managed_node2 13531 1726882430.03328: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882430.03331: Calling groups_plugins_play to load vars for managed_node2 13531 1726882430.05242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882430.07391: done with get_vars() 13531 1726882430.07436: done getting variables 13531 1726882430.07559: done queuing things up, now waiting for results queue to drain 13531 1726882430.07561: results queue empty 13531 1726882430.07562: checking for any_errors_fatal 13531 1726882430.07567: done checking for any_errors_fatal 13531 1726882430.07568: checking for max_fail_percentage 13531 1726882430.07569: done checking for max_fail_percentage 13531 1726882430.07570: 
checking to see if all hosts have failed and the running result is not ok 13531 1726882430.07571: done checking to see if all hosts have failed 13531 1726882430.07572: getting the remaining hosts for this loop 13531 1726882430.07573: done getting the remaining hosts for this loop 13531 1726882430.07575: getting the next task for host managed_node2 13531 1726882430.07581: done getting next task for host managed_node2 13531 1726882430.07583: ^ task is: TASK: Include the task 'get_interface_stat.yml' 13531 1726882430.07585: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882430.07588: getting variables 13531 1726882430.07589: in VariableManager get_vars() 13531 1726882430.07612: Calling all_inventory to load vars for managed_node2 13531 1726882430.07614: Calling groups_inventory to load vars for managed_node2 13531 1726882430.07616: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882430.07622: Calling all_plugins_play to load vars for managed_node2 13531 1726882430.07624: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882430.07627: Calling groups_plugins_play to load vars for managed_node2 13531 1726882430.09070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882430.11041: done with get_vars() 13531 1726882430.11074: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:33:50 -0400 (0:00:00.558) 0:00:18.006 ****** 13531 1726882430.11159: entering _queue_task() for managed_node2/include_tasks 13531 1726882430.11505: worker is 1 (out of 1 available) 13531 1726882430.11519: exiting _queue_task() for managed_node2/include_tasks 13531 1726882430.11532: done queuing things up, now waiting for results queue to drain 13531 1726882430.11533: waiting for pending results... 
13531 1726882430.11927: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 13531 1726882430.12119: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000006e 13531 1726882430.12155: variable 'ansible_search_path' from source: unknown 13531 1726882430.12172: variable 'ansible_search_path' from source: unknown 13531 1726882430.12230: calling self._execute() 13531 1726882430.12402: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882430.12416: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882430.12432: variable 'omit' from source: magic vars 13531 1726882430.12845: variable 'ansible_distribution_major_version' from source: facts 13531 1726882430.12873: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882430.12885: _execute() done 13531 1726882430.12897: dumping result to json 13531 1726882430.12905: done dumping result, returning 13531 1726882430.12915: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [0e448fcc-3ce9-4fd9-519d-00000000006e] 13531 1726882430.12925: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000006e 13531 1726882430.13055: no more pending results, returning what we have 13531 1726882430.13060: in VariableManager get_vars() 13531 1726882430.13127: Calling all_inventory to load vars for managed_node2 13531 1726882430.13130: Calling groups_inventory to load vars for managed_node2 13531 1726882430.13133: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882430.13147: Calling all_plugins_play to load vars for managed_node2 13531 1726882430.13150: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882430.13155: Calling groups_plugins_play to load vars for managed_node2 13531 1726882430.14240: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000006e 13531 1726882430.14243: WORKER PROCESS EXITING 13531 
1726882430.15109: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882430.16896: done with get_vars() 13531 1726882430.16920: variable 'ansible_search_path' from source: unknown 13531 1726882430.16921: variable 'ansible_search_path' from source: unknown 13531 1726882430.16969: we have included files to process 13531 1726882430.16970: generating all_blocks data 13531 1726882430.16972: done generating all_blocks data 13531 1726882430.16978: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13531 1726882430.16979: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13531 1726882430.16982: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13531 1726882430.17191: done processing included file 13531 1726882430.17193: iterating over new_blocks loaded from include file 13531 1726882430.17195: in VariableManager get_vars() 13531 1726882430.17224: done with get_vars() 13531 1726882430.17226: filtering new block on tags 13531 1726882430.17244: done filtering new block on tags 13531 1726882430.17246: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 13531 1726882430.17251: extending task lists for all hosts with included blocks 13531 1726882430.17368: done extending task lists 13531 1726882430.17370: done processing included files 13531 1726882430.17370: results queue empty 13531 1726882430.17371: checking for any_errors_fatal 13531 1726882430.17373: done checking for any_errors_fatal 13531 1726882430.17378: checking for max_fail_percentage 13531 1726882430.17380: done checking for 
max_fail_percentage 13531 1726882430.17380: checking to see if all hosts have failed and the running result is not ok 13531 1726882430.17381: done checking to see if all hosts have failed 13531 1726882430.17382: getting the remaining hosts for this loop 13531 1726882430.17383: done getting the remaining hosts for this loop 13531 1726882430.17386: getting the next task for host managed_node2 13531 1726882430.17389: done getting next task for host managed_node2 13531 1726882430.17392: ^ task is: TASK: Get stat for interface {{ interface }} 13531 1726882430.17395: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882430.17397: getting variables 13531 1726882430.17398: in VariableManager get_vars() 13531 1726882430.17417: Calling all_inventory to load vars for managed_node2 13531 1726882430.17419: Calling groups_inventory to load vars for managed_node2 13531 1726882430.17421: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882430.17426: Calling all_plugins_play to load vars for managed_node2 13531 1726882430.17428: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882430.17431: Calling groups_plugins_play to load vars for managed_node2 13531 1726882430.18649: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882430.19577: done with get_vars() 13531 1726882430.19592: done getting variables 13531 1726882430.19709: variable 'interface' from source: task vars 13531 1726882430.19712: variable 'controller_device' from source: play vars 13531 1726882430.19751: variable 'controller_device' from source: play vars TASK [Get stat for interface nm-bond] ****************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:33:50 -0400 (0:00:00.086) 0:00:18.093 ****** 13531 1726882430.19779: entering _queue_task() for managed_node2/stat 13531 1726882430.20021: worker is 1 (out of 1 available) 13531 1726882430.20145: exiting _queue_task() for managed_node2/stat 13531 1726882430.20161: done queuing things up, now waiting for results queue to drain 13531 1726882430.20163: waiting for pending results... 
13531 1726882430.20375: running TaskExecutor() for managed_node2/TASK: Get stat for interface nm-bond 13531 1726882430.20515: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000337 13531 1726882430.20534: variable 'ansible_search_path' from source: unknown 13531 1726882430.20541: variable 'ansible_search_path' from source: unknown 13531 1726882430.20586: calling self._execute() 13531 1726882430.20685: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882430.20697: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882430.20712: variable 'omit' from source: magic vars 13531 1726882430.21075: variable 'ansible_distribution_major_version' from source: facts 13531 1726882430.21092: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882430.21102: variable 'omit' from source: magic vars 13531 1726882430.21160: variable 'omit' from source: magic vars 13531 1726882430.21257: variable 'interface' from source: task vars 13531 1726882430.21270: variable 'controller_device' from source: play vars 13531 1726882430.21339: variable 'controller_device' from source: play vars 13531 1726882430.21368: variable 'omit' from source: magic vars 13531 1726882430.21442: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882430.21807: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882430.21840: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882430.21865: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882430.21919: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882430.21960: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 13531 1726882430.21969: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882430.21976: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882430.22095: Set connection var ansible_pipelining to False 13531 1726882430.22106: Set connection var ansible_timeout to 10 13531 1726882430.22115: Set connection var ansible_shell_executable to /bin/sh 13531 1726882430.22123: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882430.22129: Set connection var ansible_connection to ssh 13531 1726882430.22135: Set connection var ansible_shell_type to sh 13531 1726882430.22173: variable 'ansible_shell_executable' from source: unknown 13531 1726882430.22180: variable 'ansible_connection' from source: unknown 13531 1726882430.22186: variable 'ansible_module_compression' from source: unknown 13531 1726882430.22192: variable 'ansible_shell_type' from source: unknown 13531 1726882430.22198: variable 'ansible_shell_executable' from source: unknown 13531 1726882430.22204: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882430.22211: variable 'ansible_pipelining' from source: unknown 13531 1726882430.22217: variable 'ansible_timeout' from source: unknown 13531 1726882430.22233: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882430.22454: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13531 1726882430.22476: variable 'omit' from source: magic vars 13531 1726882430.22494: starting attempt loop 13531 1726882430.22503: running the handler 13531 1726882430.22523: _low_level_execute_command(): starting 13531 1726882430.22534: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13531 
1726882430.23352: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882430.23356: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882430.23399: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 13531 1726882430.23403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882430.23405: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882430.23408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882430.23466: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882430.23471: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882430.23474: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882430.23577: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882430.25246: stdout chunk (state=3): >>>/root <<< 13531 1726882430.25437: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882430.25441: stdout chunk (state=3): >>><<< 13531 1726882430.25443: stderr chunk (state=3): >>><<< 13531 1726882430.25568: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882430.25571: _low_level_execute_command(): starting 13531 1726882430.25575: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882430.2547097-14342-70524091758299 `" && echo ansible-tmp-1726882430.2547097-14342-70524091758299="` echo /root/.ansible/tmp/ansible-tmp-1726882430.2547097-14342-70524091758299 `" ) && sleep 0' 13531 1726882430.26146: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882430.26159: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882430.26178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882430.26216: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882430.26256: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882430.26271: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882430.26284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882430.26300: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882430.26311: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882430.26327: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882430.26339: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882430.26351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882430.26368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882430.26391: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882430.26403: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882430.26415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882430.26510: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882430.26530: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882430.26552: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882430.26692: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882430.28588: stdout chunk (state=3): >>>ansible-tmp-1726882430.2547097-14342-70524091758299=/root/.ansible/tmp/ansible-tmp-1726882430.2547097-14342-70524091758299 <<< 13531 1726882430.28787: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882430.28791: stdout chunk (state=3): >>><<< 13531 1726882430.28793: stderr chunk (state=3): >>><<< 13531 1726882430.29071: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882430.2547097-14342-70524091758299=/root/.ansible/tmp/ansible-tmp-1726882430.2547097-14342-70524091758299 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882430.29076: variable 'ansible_module_compression' from source: unknown 13531 1726882430.29078: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13531i_snf1g8/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 13531 1726882430.29080: variable 'ansible_facts' from source: unknown 13531 1726882430.29082: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882430.2547097-14342-70524091758299/AnsiballZ_stat.py 13531 1726882430.29207: 
Sending initial data 13531 1726882430.29216: Sent initial data (152 bytes) 13531 1726882430.30845: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882430.30849: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882430.30900: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882430.31030: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882430.32795: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" 
revision 1 <<< 13531 1726882430.32888: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13531 1726882430.33005: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13531i_snf1g8/tmpejq2ij0v /root/.ansible/tmp/ansible-tmp-1726882430.2547097-14342-70524091758299/AnsiballZ_stat.py <<< 13531 1726882430.33092: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13531 1726882430.34746: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882430.34768: stderr chunk (state=3): >>><<< 13531 1726882430.34772: stdout chunk (state=3): >>><<< 13531 1726882430.34873: done transferring module to remote 13531 1726882430.34876: _low_level_execute_command(): starting 13531 1726882430.34878: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882430.2547097-14342-70524091758299/ /root/.ansible/tmp/ansible-tmp-1726882430.2547097-14342-70524091758299/AnsiballZ_stat.py && sleep 0' 13531 1726882430.35634: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882430.35639: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882430.35671: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882430.35692: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 
1726882430.35699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882430.35760: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882430.35765: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882430.35779: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882430.35892: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882430.37661: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882430.37738: stderr chunk (state=3): >>><<< 13531 1726882430.37742: stdout chunk (state=3): >>><<< 13531 1726882430.37759: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882430.37762: _low_level_execute_command(): starting 13531 1726882430.37771: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882430.2547097-14342-70524091758299/AnsiballZ_stat.py && sleep 0' 13531 1726882430.38455: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882430.38459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882430.38490: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882430.38499: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882430.38505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882430.38545: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882430.38551: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882430.38558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882430.38572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882430.38580: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882430.38585: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882430.38590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882430.38637: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882430.38670: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882430.38672: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882430.38772: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882430.51907: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/nm-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28014, "dev": 21, "nlink": 1, "atime": 1726882429.087201, "mtime": 1726882429.087201, "ctime": 1726882429.087201, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/nm-bond", "follow": false, "checksum_algorithm": "sha1"}}} <<< 13531 1726882430.52880: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 13531 1726882430.52934: stderr chunk (state=3): >>><<< 13531 1726882430.52937: stdout chunk (state=3): >>><<< 13531 1726882430.52954: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/nm-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28014, "dev": 21, "nlink": 1, "atime": 1726882429.087201, "mtime": 1726882429.087201, "ctime": 1726882429.087201, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/nm-bond", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 13531 1726882430.52996: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882430.2547097-14342-70524091758299/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13531 1726882430.53004: _low_level_execute_command(): starting 13531 1726882430.53010: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882430.2547097-14342-70524091758299/ > /dev/null 2>&1 && sleep 0' 13531 1726882430.53479: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882430.53483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882430.53532: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 13531 1726882430.53536: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882430.53538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882430.53589: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882430.53601: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882430.53710: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882430.55519: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882430.55575: stderr chunk (state=3): >>><<< 13531 1726882430.55578: stdout chunk (state=3): >>><<< 13531 1726882430.55594: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882430.55599: handler run complete 13531 1726882430.55630: attempt loop complete, returning result 13531 1726882430.55633: _execute() done 13531 1726882430.55635: dumping result to json 13531 1726882430.55641: done dumping result, returning 13531 1726882430.55652: done running TaskExecutor() for managed_node2/TASK: Get stat for interface nm-bond [0e448fcc-3ce9-4fd9-519d-000000000337] 13531 1726882430.55660: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000337 13531 1726882430.55772: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000337 13531 1726882430.55775: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "atime": 1726882429.087201, "block_size": 4096, "blocks": 0, "ctime": 1726882429.087201, "dev": 21, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 28014, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "mode": "0777", "mtime": 1726882429.087201, "nlink": 1, "path": "/sys/class/net/nm-bond", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 13531 1726882430.55863: no more pending results, returning what we have 13531 1726882430.55868: results queue empty 13531 1726882430.55869: checking for any_errors_fatal 13531 
1726882430.55871: done checking for any_errors_fatal 13531 1726882430.55872: checking for max_fail_percentage 13531 1726882430.55873: done checking for max_fail_percentage 13531 1726882430.55875: checking to see if all hosts have failed and the running result is not ok 13531 1726882430.55876: done checking to see if all hosts have failed 13531 1726882430.55877: getting the remaining hosts for this loop 13531 1726882430.55878: done getting the remaining hosts for this loop 13531 1726882430.55882: getting the next task for host managed_node2 13531 1726882430.55889: done getting next task for host managed_node2 13531 1726882430.55892: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 13531 1726882430.55895: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882430.55899: getting variables 13531 1726882430.55900: in VariableManager get_vars() 13531 1726882430.55948: Calling all_inventory to load vars for managed_node2 13531 1726882430.55951: Calling groups_inventory to load vars for managed_node2 13531 1726882430.55954: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882430.55966: Calling all_plugins_play to load vars for managed_node2 13531 1726882430.55969: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882430.55972: Calling groups_plugins_play to load vars for managed_node2 13531 1726882430.60009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882430.60921: done with get_vars() 13531 1726882430.60938: done getting variables 13531 1726882430.60977: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13531 1726882430.61050: variable 'interface' from source: task vars 13531 1726882430.61052: variable 'controller_device' from source: play vars 13531 1726882430.61096: variable 'controller_device' from source: play vars TASK [Assert that the interface is present - 'nm-bond'] ************************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:33:50 -0400 (0:00:00.413) 0:00:18.506 ****** 13531 1726882430.61122: entering _queue_task() for managed_node2/assert 13531 1726882430.61355: worker is 1 (out of 1 available) 13531 1726882430.61370: exiting _queue_task() for managed_node2/assert 13531 1726882430.61382: done queuing things up, now waiting for results queue to drain 13531 1726882430.61384: waiting for pending results... 
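The module_args captured in the stat result above fully determine the check being run. Reconstructed as a task, it looks roughly like this (a sketch inferred from the log output and the registered variable name it later evaluates; the verbatim task in the test playbook may differ):

```yaml
# Sketch reconstructed from the logged module_args; not the literal playbook source.
- name: Get stat for interface nm-bond
  stat:
    path: /sys/class/net/nm-bond   # bonds show up as symlinks into /sys/devices/virtual/net
    follow: false                  # stat the link itself, not its target
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat
```

Note that with `follow: false` the result reports `islnk: true` and exposes `lnk_source`/`lnk_target`, which is exactly what the JSON above shows for `/sys/class/net/nm-bond`.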
13531 1726882430.61578: running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'nm-bond' 13531 1726882430.61664: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000006f 13531 1726882430.61677: variable 'ansible_search_path' from source: unknown 13531 1726882430.61680: variable 'ansible_search_path' from source: unknown 13531 1726882430.61715: calling self._execute() 13531 1726882430.61789: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882430.61793: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882430.61801: variable 'omit' from source: magic vars 13531 1726882430.62078: variable 'ansible_distribution_major_version' from source: facts 13531 1726882430.62088: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882430.62094: variable 'omit' from source: magic vars 13531 1726882430.62127: variable 'omit' from source: magic vars 13531 1726882430.62193: variable 'interface' from source: task vars 13531 1726882430.62197: variable 'controller_device' from source: play vars 13531 1726882430.62243: variable 'controller_device' from source: play vars 13531 1726882430.62258: variable 'omit' from source: magic vars 13531 1726882430.62296: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882430.62320: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882430.62341: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882430.62356: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882430.62370: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882430.62394: variable 'inventory_hostname' from source: 
host vars for 'managed_node2' 13531 1726882430.62397: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882430.62400: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882430.62477: Set connection var ansible_pipelining to False 13531 1726882430.62481: Set connection var ansible_timeout to 10 13531 1726882430.62487: Set connection var ansible_shell_executable to /bin/sh 13531 1726882430.62492: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882430.62495: Set connection var ansible_connection to ssh 13531 1726882430.62497: Set connection var ansible_shell_type to sh 13531 1726882430.62517: variable 'ansible_shell_executable' from source: unknown 13531 1726882430.62520: variable 'ansible_connection' from source: unknown 13531 1726882430.62523: variable 'ansible_module_compression' from source: unknown 13531 1726882430.62525: variable 'ansible_shell_type' from source: unknown 13531 1726882430.62528: variable 'ansible_shell_executable' from source: unknown 13531 1726882430.62530: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882430.62533: variable 'ansible_pipelining' from source: unknown 13531 1726882430.62537: variable 'ansible_timeout' from source: unknown 13531 1726882430.62539: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882430.62639: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882430.62649: variable 'omit' from source: magic vars 13531 1726882430.62656: starting attempt loop 13531 1726882430.62660: running the handler 13531 1726882430.62757: variable 'interface_stat' from source: set_fact 13531 1726882430.62776: Evaluated conditional 
(interface_stat.stat.exists): True 13531 1726882430.62779: handler run complete 13531 1726882430.62793: attempt loop complete, returning result 13531 1726882430.62796: _execute() done 13531 1726882430.62799: dumping result to json 13531 1726882430.62802: done dumping result, returning 13531 1726882430.62807: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'nm-bond' [0e448fcc-3ce9-4fd9-519d-00000000006f] 13531 1726882430.62815: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000006f 13531 1726882430.62900: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000006f 13531 1726882430.62903: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 13531 1726882430.62958: no more pending results, returning what we have 13531 1726882430.62961: results queue empty 13531 1726882430.62962: checking for any_errors_fatal 13531 1726882430.62978: done checking for any_errors_fatal 13531 1726882430.62979: checking for max_fail_percentage 13531 1726882430.62981: done checking for max_fail_percentage 13531 1726882430.62982: checking to see if all hosts have failed and the running result is not ok 13531 1726882430.62982: done checking to see if all hosts have failed 13531 1726882430.62983: getting the remaining hosts for this loop 13531 1726882430.62984: done getting the remaining hosts for this loop 13531 1726882430.62987: getting the next task for host managed_node2 13531 1726882430.62994: done getting next task for host managed_node2 13531 1726882430.62997: ^ task is: TASK: Include the task 'assert_profile_present.yml' 13531 1726882430.63003: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882430.63006: getting variables 13531 1726882430.63008: in VariableManager get_vars() 13531 1726882430.63052: Calling all_inventory to load vars for managed_node2 13531 1726882430.63054: Calling groups_inventory to load vars for managed_node2 13531 1726882430.63056: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882430.63068: Calling all_plugins_play to load vars for managed_node2 13531 1726882430.63071: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882430.63074: Calling groups_plugins_play to load vars for managed_node2 13531 1726882430.63882: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882430.64907: done with get_vars() 13531 1726882430.64921: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:67 Friday 20 September 2024 21:33:50 -0400 (0:00:00.038) 0:00:18.545 ****** 13531 1726882430.64987: entering _queue_task() for managed_node2/include_tasks 13531 1726882430.65194: worker is 1 (out of 1 available) 13531 1726882430.65207: exiting _queue_task() for managed_node2/include_tasks 13531 1726882430.65221: done queuing things up, now waiting for results queue to drain 13531 1726882430.65222: waiting for pending results... 
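The assert task above succeeds because the log shows `Evaluated conditional (interface_stat.stat.exists): True`. The task at `assert_device_present.yml:5` therefore has roughly this shape (a hedged sketch based on the logged task name and conditional, not the file's verbatim contents):

```yaml
# Sketch of the pattern visible in the log; the real assert_device_present.yml
# in fedora.linux_system_roles may carry additional assertions or a fail_msg.
- name: "Assert that the interface is present - '{{ interface }}'"
  assert:
    that:
      - interface_stat.stat.exists
```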
13531 1726882430.65400: running TaskExecutor() for managed_node2/TASK: Include the task 'assert_profile_present.yml' 13531 1726882430.65472: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000070 13531 1726882430.65479: variable 'ansible_search_path' from source: unknown 13531 1726882430.65519: variable 'controller_profile' from source: play vars 13531 1726882430.65666: variable 'controller_profile' from source: play vars 13531 1726882430.65681: variable 'port1_profile' from source: play vars 13531 1726882430.65730: variable 'port1_profile' from source: play vars 13531 1726882430.65735: variable 'port2_profile' from source: play vars 13531 1726882430.65785: variable 'port2_profile' from source: play vars 13531 1726882430.65798: variable 'omit' from source: magic vars 13531 1726882430.65899: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882430.65909: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882430.65917: variable 'omit' from source: magic vars 13531 1726882430.66089: variable 'ansible_distribution_major_version' from source: facts 13531 1726882430.66097: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882430.66120: variable 'item' from source: unknown 13531 1726882430.66170: variable 'item' from source: unknown 13531 1726882430.66283: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882430.66286: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882430.66289: variable 'omit' from source: magic vars 13531 1726882430.66373: variable 'ansible_distribution_major_version' from source: facts 13531 1726882430.66377: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882430.66396: variable 'item' from source: unknown 13531 1726882430.66440: variable 'item' from source: unknown 13531 1726882430.66512: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 
1726882430.66515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882430.66518: variable 'omit' from source: magic vars 13531 1726882430.66620: variable 'ansible_distribution_major_version' from source: facts 13531 1726882430.66623: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882430.66639: variable 'item' from source: unknown 13531 1726882430.66685: variable 'item' from source: unknown 13531 1726882430.66750: dumping result to json 13531 1726882430.66755: done dumping result, returning 13531 1726882430.66758: done running TaskExecutor() for managed_node2/TASK: Include the task 'assert_profile_present.yml' [0e448fcc-3ce9-4fd9-519d-000000000070] 13531 1726882430.66759: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000070 13531 1726882430.66803: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000070 13531 1726882430.66805: WORKER PROCESS EXITING 13531 1726882430.66830: no more pending results, returning what we have 13531 1726882430.66839: in VariableManager get_vars() 13531 1726882430.66895: Calling all_inventory to load vars for managed_node2 13531 1726882430.66898: Calling groups_inventory to load vars for managed_node2 13531 1726882430.66900: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882430.66910: Calling all_plugins_play to load vars for managed_node2 13531 1726882430.66912: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882430.66915: Calling groups_plugins_play to load vars for managed_node2 13531 1726882430.67705: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882430.68644: done with get_vars() 13531 1726882430.68660: variable 'ansible_search_path' from source: unknown 13531 1726882430.68674: variable 'ansible_search_path' from source: unknown 13531 1726882430.68680: variable 'ansible_search_path' from source: unknown 13531 
1726882430.68684: we have included files to process 13531 1726882430.68684: generating all_blocks data 13531 1726882430.68685: done generating all_blocks data 13531 1726882430.68689: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13531 1726882430.68690: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13531 1726882430.68691: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13531 1726882430.68814: in VariableManager get_vars() 13531 1726882430.68835: done with get_vars() 13531 1726882430.69004: done processing included file 13531 1726882430.69006: iterating over new_blocks loaded from include file 13531 1726882430.69007: in VariableManager get_vars() 13531 1726882430.69022: done with get_vars() 13531 1726882430.69023: filtering new block on tags 13531 1726882430.69035: done filtering new block on tags 13531 1726882430.69037: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node2 => (item=bond0) 13531 1726882430.69039: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13531 1726882430.69040: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13531 1726882430.69042: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13531 1726882430.69106: in VariableManager get_vars() 13531 1726882430.69125: done with get_vars() 13531 1726882430.69273: done 
processing included file 13531 1726882430.69275: iterating over new_blocks loaded from include file 13531 1726882430.69276: in VariableManager get_vars() 13531 1726882430.69290: done with get_vars() 13531 1726882430.69291: filtering new block on tags 13531 1726882430.69302: done filtering new block on tags 13531 1726882430.69303: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node2 => (item=bond0.0) 13531 1726882430.69305: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13531 1726882430.69306: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13531 1726882430.69307: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13531 1726882430.69404: in VariableManager get_vars() 13531 1726882430.69421: done with get_vars() 13531 1726882430.69568: done processing included file 13531 1726882430.69569: iterating over new_blocks loaded from include file 13531 1726882430.69570: in VariableManager get_vars() 13531 1726882430.69585: done with get_vars() 13531 1726882430.69586: filtering new block on tags 13531 1726882430.69598: done filtering new block on tags 13531 1726882430.69599: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node2 => (item=bond0.1) 13531 1726882430.69601: extending task lists for all hosts with included blocks 13531 1726882430.72799: done extending task lists 13531 1726882430.72805: done processing included files 13531 1726882430.72806: results queue empty 13531 
1726882430.72806: checking for any_errors_fatal 13531 1726882430.72809: done checking for any_errors_fatal 13531 1726882430.72809: checking for max_fail_percentage 13531 1726882430.72810: done checking for max_fail_percentage 13531 1726882430.72810: checking to see if all hosts have failed and the running result is not ok 13531 1726882430.72811: done checking to see if all hosts have failed 13531 1726882430.72811: getting the remaining hosts for this loop 13531 1726882430.72812: done getting the remaining hosts for this loop 13531 1726882430.72814: getting the next task for host managed_node2 13531 1726882430.72816: done getting next task for host managed_node2 13531 1726882430.72818: ^ task is: TASK: Include the task 'get_profile_stat.yml' 13531 1726882430.72819: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882430.72821: getting variables 13531 1726882430.72821: in VariableManager get_vars() 13531 1726882430.72835: Calling all_inventory to load vars for managed_node2 13531 1726882430.72837: Calling groups_inventory to load vars for managed_node2 13531 1726882430.72839: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882430.72844: Calling all_plugins_play to load vars for managed_node2 13531 1726882430.72846: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882430.72847: Calling groups_plugins_play to load vars for managed_node2 13531 1726882430.73547: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882430.75121: done with get_vars() 13531 1726882430.75148: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 21:33:50 -0400 (0:00:00.102) 0:00:18.647 ****** 13531 1726882430.75235: entering _queue_task() for managed_node2/include_tasks 13531 1726882430.75591: worker is 1 (out of 1 available) 13531 1726882430.75601: exiting _queue_task() for managed_node2/include_tasks 13531 1726882430.75614: done queuing things up, now waiting for results queue to drain 13531 1726882430.75615: waiting for pending results... 
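The three inclusions above (`item=bond0`, `item=bond0.0`, `item=bond0.1`) come from a looped include driven by the play vars the log names: `controller_profile`, `port1_profile`, and `port2_profile`. A minimal sketch of that task, assuming a plain `loop` over those vars (the actual `tests_bond_removal.yml:67` may phrase it differently):

```yaml
# Hypothetical reconstruction of the looped include seen in the log.
- name: Include the task 'assert_profile_present.yml'
  include_tasks: tasks/assert_profile_present.yml
  loop:
    - "{{ controller_profile }}"   # -> bond0
    - "{{ port1_profile }}"        # -> bond0.0
    - "{{ port2_profile }}"        # -> bond0.1
```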
13531 1726882430.75920: running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' 13531 1726882430.76036: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000355 13531 1726882430.76068: variable 'ansible_search_path' from source: unknown 13531 1726882430.76078: variable 'ansible_search_path' from source: unknown 13531 1726882430.76120: calling self._execute() 13531 1726882430.76227: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882430.76240: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882430.76259: variable 'omit' from source: magic vars 13531 1726882430.76676: variable 'ansible_distribution_major_version' from source: facts 13531 1726882430.76694: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882430.76705: _execute() done 13531 1726882430.76718: dumping result to json 13531 1726882430.76725: done dumping result, returning 13531 1726882430.76734: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' [0e448fcc-3ce9-4fd9-519d-000000000355] 13531 1726882430.76746: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000355 13531 1726882430.76845: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000355 13531 1726882430.76848: WORKER PROCESS EXITING 13531 1726882430.76877: no more pending results, returning what we have 13531 1726882430.76883: in VariableManager get_vars() 13531 1726882430.76945: Calling all_inventory to load vars for managed_node2 13531 1726882430.76948: Calling groups_inventory to load vars for managed_node2 13531 1726882430.76950: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882430.76967: Calling all_plugins_play to load vars for managed_node2 13531 1726882430.76970: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882430.76973: Calling groups_plugins_play to load vars for managed_node2 13531 
1726882430.77917: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882430.78861: done with get_vars() 13531 1726882430.78876: variable 'ansible_search_path' from source: unknown 13531 1726882430.78877: variable 'ansible_search_path' from source: unknown 13531 1726882430.78903: we have included files to process 13531 1726882430.78904: generating all_blocks data 13531 1726882430.78905: done generating all_blocks data 13531 1726882430.78906: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 13531 1726882430.78907: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 13531 1726882430.78908: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 13531 1726882430.79581: done processing included file 13531 1726882430.79582: iterating over new_blocks loaded from include file 13531 1726882430.79583: in VariableManager get_vars() 13531 1726882430.79602: done with get_vars() 13531 1726882430.79603: filtering new block on tags 13531 1726882430.79619: done filtering new block on tags 13531 1726882430.79621: in VariableManager get_vars() 13531 1726882430.79637: done with get_vars() 13531 1726882430.79638: filtering new block on tags 13531 1726882430.79650: done filtering new block on tags 13531 1726882430.79651: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node2 13531 1726882430.79657: extending task lists for all hosts with included blocks 13531 1726882430.79761: done extending task lists 13531 1726882430.79762: done processing included files 13531 1726882430.79762: results queue empty 13531 
1726882430.79763: checking for any_errors_fatal 13531 1726882430.79766: done checking for any_errors_fatal 13531 1726882430.79767: checking for max_fail_percentage 13531 1726882430.79768: done checking for max_fail_percentage 13531 1726882430.79768: checking to see if all hosts have failed and the running result is not ok 13531 1726882430.79769: done checking to see if all hosts have failed 13531 1726882430.79769: getting the remaining hosts for this loop 13531 1726882430.79770: done getting the remaining hosts for this loop 13531 1726882430.79772: getting the next task for host managed_node2 13531 1726882430.79774: done getting next task for host managed_node2 13531 1726882430.79775: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 13531 1726882430.79777: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882430.79779: getting variables 13531 1726882430.79779: in VariableManager get_vars() 13531 1726882430.79824: Calling all_inventory to load vars for managed_node2 13531 1726882430.79826: Calling groups_inventory to load vars for managed_node2 13531 1726882430.79828: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882430.79834: Calling all_plugins_play to load vars for managed_node2 13531 1726882430.79835: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882430.79837: Calling groups_plugins_play to load vars for managed_node2 13531 1726882430.80504: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882430.81477: done with get_vars() 13531 1726882430.81491: done getting variables 13531 1726882430.81519: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:33:50 -0400 (0:00:00.063) 0:00:18.710 ****** 13531 1726882430.81540: entering _queue_task() for managed_node2/set_fact 13531 1726882430.81784: worker is 1 (out of 1 available) 13531 1726882430.81797: exiting _queue_task() for managed_node2/set_fact 13531 1726882430.81810: done queuing things up, now waiting for results queue to drain 13531 1726882430.81811: waiting for pending results... 
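The "entering _queue_task() ... worker is 1 (out of 1 available) ... waiting for pending results..." sequence above is the strategy plugin handing a task to a worker and then draining a results queue. A toy sketch of that handoff, using a thread in place of Ansible's actual `WorkerProcess` (this is illustrative only, not Ansible's real API):

```python
from queue import Queue
from threading import Thread

# Toy stand-in for the queue/worker handoff the log shows:
# the strategy queues a task, one worker runs it, and the main
# loop blocks until the results queue drains.
def worker(task, results):
    # A real WorkerProcess would run TaskExecutor(); here we just
    # report success for the queued action.
    results.put({"task": task, "ok": True})

results = Queue()
t = Thread(target=worker, args=("set_fact", results))
t.start()
t.join()            # "done queuing things up, now waiting for results queue to drain"
res = results.get()
print(res)
```

With one worker available, each task is queued, executed, and collected before the next host/task pairing is computed, which is why the log interleaves `_queue_task()` entries with "no more pending results" lines.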
13531 1726882430.81996: running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag 13531 1726882430.82071: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000005e4 13531 1726882430.82081: variable 'ansible_search_path' from source: unknown 13531 1726882430.82085: variable 'ansible_search_path' from source: unknown 13531 1726882430.82114: calling self._execute() 13531 1726882430.82190: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882430.82195: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882430.82203: variable 'omit' from source: magic vars 13531 1726882430.82488: variable 'ansible_distribution_major_version' from source: facts 13531 1726882430.82498: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882430.82504: variable 'omit' from source: magic vars 13531 1726882430.82531: variable 'omit' from source: magic vars 13531 1726882430.82559: variable 'omit' from source: magic vars 13531 1726882430.82594: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882430.82620: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882430.82636: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882430.82652: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882430.82667: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882430.82690: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882430.82693: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882430.82696: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 13531 1726882430.82764: Set connection var ansible_pipelining to False 13531 1726882430.82768: Set connection var ansible_timeout to 10 13531 1726882430.82774: Set connection var ansible_shell_executable to /bin/sh 13531 1726882430.82783: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882430.82786: Set connection var ansible_connection to ssh 13531 1726882430.82788: Set connection var ansible_shell_type to sh 13531 1726882430.82809: variable 'ansible_shell_executable' from source: unknown 13531 1726882430.82812: variable 'ansible_connection' from source: unknown 13531 1726882430.82815: variable 'ansible_module_compression' from source: unknown 13531 1726882430.82817: variable 'ansible_shell_type' from source: unknown 13531 1726882430.82820: variable 'ansible_shell_executable' from source: unknown 13531 1726882430.82822: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882430.82826: variable 'ansible_pipelining' from source: unknown 13531 1726882430.82829: variable 'ansible_timeout' from source: unknown 13531 1726882430.82832: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882430.82936: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882430.82945: variable 'omit' from source: magic vars 13531 1726882430.82950: starting attempt loop 13531 1726882430.82955: running the handler 13531 1726882430.82965: handler run complete 13531 1726882430.82974: attempt loop complete, returning result 13531 1726882430.82977: _execute() done 13531 1726882430.82979: dumping result to json 13531 1726882430.82982: done dumping result, returning 13531 1726882430.82989: done running TaskExecutor() for 
managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag [0e448fcc-3ce9-4fd9-519d-0000000005e4] 13531 1726882430.82999: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000005e4 13531 1726882430.83083: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000005e4 13531 1726882430.83085: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 13531 1726882430.83172: no more pending results, returning what we have 13531 1726882430.83175: results queue empty 13531 1726882430.83176: checking for any_errors_fatal 13531 1726882430.83178: done checking for any_errors_fatal 13531 1726882430.83178: checking for max_fail_percentage 13531 1726882430.83180: done checking for max_fail_percentage 13531 1726882430.83181: checking to see if all hosts have failed and the running result is not ok 13531 1726882430.83182: done checking to see if all hosts have failed 13531 1726882430.83182: getting the remaining hosts for this loop 13531 1726882430.83183: done getting the remaining hosts for this loop 13531 1726882430.83187: getting the next task for host managed_node2 13531 1726882430.83192: done getting next task for host managed_node2 13531 1726882430.83195: ^ task is: TASK: Stat profile file 13531 1726882430.83199: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882430.83203: getting variables 13531 1726882430.83210: in VariableManager get_vars() 13531 1726882430.83258: Calling all_inventory to load vars for managed_node2 13531 1726882430.83261: Calling groups_inventory to load vars for managed_node2 13531 1726882430.83265: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882430.83275: Calling all_plugins_play to load vars for managed_node2 13531 1726882430.83277: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882430.83279: Calling groups_plugins_play to load vars for managed_node2 13531 1726882430.84110: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882430.85045: done with get_vars() 13531 1726882430.85067: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:33:50 -0400 (0:00:00.035) 0:00:18.746 ****** 13531 1726882430.85133: entering _queue_task() for managed_node2/stat 13531 1726882430.85377: worker is 1 (out of 1 available) 13531 1726882430.85389: exiting _queue_task() for managed_node2/stat 13531 1726882430.85401: done queuing things up, now waiting for results queue to drain 13531 1726882430.85402: waiting for pending results... 
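Every verbose line above is prefixed with `<pid> <epoch-seconds>:` (here `13531 1726882430...`), and the task banner renders the same instant as local wall-clock time ("Friday 20 September 2024 21:33:50 -0400"). A small sketch (an assumed helper, not part of Ansible) that decodes such a prefix into that wall-clock form, hard-coding the log's UTC-4 offset:

```python
from datetime import datetime, timezone, timedelta

def decode_prefix(prefix: str, utc_offset_hours: int = -4) -> str:
    """Decode an ansible -vvvv line prefix '<pid> <epoch-seconds>'."""
    pid, epoch = prefix.split()
    tz = timezone(timedelta(hours=utc_offset_hours))
    t = datetime.fromtimestamp(float(epoch), tz)
    return f"pid={pid} at {t:%A %d %B %Y %H:%M:%S}"

# Matches the banner printed above for this task's timestamp.
print(decode_prefix("13531 1726882430.81540"))
# → pid=13531 at Friday 20 September 2024 21:33:50
```

The `(0:00:00.035) 0:00:18.746` pair in the banner is the per-task elapsed time and the cumulative playbook time, i.e. the difference between successive epoch prefixes.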
13531 1726882430.85654: running TaskExecutor() for managed_node2/TASK: Stat profile file 13531 1726882430.85726: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000005e5 13531 1726882430.85742: variable 'ansible_search_path' from source: unknown 13531 1726882430.85745: variable 'ansible_search_path' from source: unknown 13531 1726882430.85779: calling self._execute() 13531 1726882430.85848: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882430.85852: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882430.85866: variable 'omit' from source: magic vars 13531 1726882430.86149: variable 'ansible_distribution_major_version' from source: facts 13531 1726882430.86162: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882430.86168: variable 'omit' from source: magic vars 13531 1726882430.86203: variable 'omit' from source: magic vars 13531 1726882430.86271: variable 'profile' from source: include params 13531 1726882430.86281: variable 'item' from source: include params 13531 1726882430.86327: variable 'item' from source: include params 13531 1726882430.86342: variable 'omit' from source: magic vars 13531 1726882430.86380: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882430.86410: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882430.86426: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882430.86439: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882430.86448: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882430.86475: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 
1726882430.86478: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882430.86481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882430.86551: Set connection var ansible_pipelining to False 13531 1726882430.86557: Set connection var ansible_timeout to 10 13531 1726882430.86559: Set connection var ansible_shell_executable to /bin/sh 13531 1726882430.86566: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882430.86569: Set connection var ansible_connection to ssh 13531 1726882430.86571: Set connection var ansible_shell_type to sh 13531 1726882430.86592: variable 'ansible_shell_executable' from source: unknown 13531 1726882430.86594: variable 'ansible_connection' from source: unknown 13531 1726882430.86597: variable 'ansible_module_compression' from source: unknown 13531 1726882430.86599: variable 'ansible_shell_type' from source: unknown 13531 1726882430.86602: variable 'ansible_shell_executable' from source: unknown 13531 1726882430.86604: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882430.86607: variable 'ansible_pipelining' from source: unknown 13531 1726882430.86610: variable 'ansible_timeout' from source: unknown 13531 1726882430.86612: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882430.86759: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13531 1726882430.86767: variable 'omit' from source: magic vars 13531 1726882430.86773: starting attempt loop 13531 1726882430.86776: running the handler 13531 1726882430.86787: _low_level_execute_command(): starting 13531 1726882430.86794: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13531 1726882430.87385: stderr chunk 
(state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882430.87388: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882430.87575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882430.87579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882430.87582: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882430.87584: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882430.87586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882430.87588: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882430.87590: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882430.87592: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882430.87594: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882430.87596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882430.87598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882430.87600: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882430.87602: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882430.87604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882430.87607: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882430.87612: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882430.87625: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 13531 1726882430.87763: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882430.89422: stdout chunk (state=3): >>>/root <<< 13531 1726882430.89530: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882430.89575: stderr chunk (state=3): >>><<< 13531 1726882430.89578: stdout chunk (state=3): >>><<< 13531 1726882430.89596: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882430.89607: _low_level_execute_command(): starting 13531 1726882430.89612: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882430.8959565-14374-17363993778336 `" && echo ansible-tmp-1726882430.8959565-14374-17363993778336="` 
echo /root/.ansible/tmp/ansible-tmp-1726882430.8959565-14374-17363993778336 `" ) && sleep 0' 13531 1726882430.90054: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882430.90060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882430.90095: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882430.90099: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882430.90101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882430.90150: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882430.90153: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882430.90267: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882430.92137: stdout chunk (state=3): >>>ansible-tmp-1726882430.8959565-14374-17363993778336=/root/.ansible/tmp/ansible-tmp-1726882430.8959565-14374-17363993778336 <<< 13531 1726882430.92248: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882430.92320: stderr chunk (state=3): >>><<< 13531 1726882430.92323: stdout chunk 
(state=3): >>><<< 13531 1726882430.92471: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882430.8959565-14374-17363993778336=/root/.ansible/tmp/ansible-tmp-1726882430.8959565-14374-17363993778336 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882430.92475: variable 'ansible_module_compression' from source: unknown 13531 1726882430.92477: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13531i_snf1g8/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 13531 1726882430.92580: variable 'ansible_facts' from source: unknown 13531 1726882430.92583: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882430.8959565-14374-17363993778336/AnsiballZ_stat.py 13531 1726882430.92714: Sending initial data 13531 1726882430.92717: Sent initial data (152 bytes) 13531 1726882430.93701: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 
<<< 13531 1726882430.93716: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882430.93731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882430.93750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882430.93801: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882430.93814: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882430.93828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882430.93846: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882430.93859: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882430.93875: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882430.93891: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882430.93905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882430.93921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882430.93934: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882430.93946: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882430.93960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882430.94039: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882430.94055: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882430.94072: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 
1726882430.94204: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882430.95966: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13531 1726882430.96060: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13531 1726882430.96166: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13531i_snf1g8/tmpfqu6swsf /root/.ansible/tmp/ansible-tmp-1726882430.8959565-14374-17363993778336/AnsiballZ_stat.py <<< 13531 1726882430.96258: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13531 1726882430.97882: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882430.98067: stderr chunk (state=3): >>><<< 13531 1726882430.98070: stdout chunk (state=3): >>><<< 13531 1726882430.98072: done transferring module to remote 13531 1726882430.98075: _low_level_execute_command(): starting 13531 1726882430.98077: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882430.8959565-14374-17363993778336/ /root/.ansible/tmp/ansible-tmp-1726882430.8959565-14374-17363993778336/AnsiballZ_stat.py && sleep 0' 13531 1726882430.98769: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 
1726882430.98777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882430.98797: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882430.98819: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882430.98822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882430.98897: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882430.98900: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882430.98905: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882430.99011: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882431.00786: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882431.00872: stderr chunk (state=3): >>><<< 13531 1726882431.00878: stdout chunk (state=3): >>><<< 13531 1726882431.00987: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882431.00991: _low_level_execute_command(): starting 13531 1726882431.00994: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882430.8959565-14374-17363993778336/AnsiballZ_stat.py && sleep 0' 13531 1726882431.01633: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882431.01647: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882431.01662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882431.01683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882431.01729: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882431.01742: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882431.01757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882431.01778: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass <<< 13531 1726882431.01791: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882431.01802: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882431.01815: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882431.01834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882431.01850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882431.01869: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882431.01881: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882431.01895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882431.01971: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882431.01993: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882431.02008: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882431.02137: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882431.15192: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 13531 1726882431.16188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 13531 1726882431.16282: stderr chunk (state=3): >>><<< 13531 1726882431.16285: stdout chunk (state=3): >>><<< 13531 1726882431.16416: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
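The stat module's stdout above is a single JSON document; the controller parses it and later tasks in get_profile_stat.yml consume `stat.exists`. A sketch of that consumption (the variable name check below mirrors the `lsr_net_profile_exists` fact initialized earlier, but this is not the role's actual code):

```python
import json

# stdout returned by AnsiballZ_stat.py above, abridged to the fields used here.
module_stdout = '{"changed": false, "stat": {"exists": false}}'

result = json.loads(module_stdout)
# The role would update lsr_net_profile_exists from stat.exists; here we
# just extract it: the ifcfg-bond0 file is absent on managed_node2.
profile_exists = result["stat"]["exists"]
print(profile_exists)
```

Because `exists` is false, no other `stat` fields (checksum, mode, mtime) are present in the result, which is why the module was invoked with `get_attributes`, `get_checksum`, and `get_mime` all disabled.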
13531 1726882431.16421: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882430.8959565-14374-17363993778336/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13531 1726882431.16423: _low_level_execute_command(): starting 13531 1726882431.16426: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882430.8959565-14374-17363993778336/ > /dev/null 2>&1 && sleep 0' 13531 1726882431.17002: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882431.17023: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882431.17307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882431.17328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882431.17405: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882431.17418: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882431.17433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882431.17450: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 
1726882431.17472: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882431.17485: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882431.17502: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882431.17533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882431.17552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882431.17567: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882431.17587: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882431.17605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882431.17701: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882431.17732: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882431.17752: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882431.17892: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882431.19804: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882431.19807: stdout chunk (state=3): >>><<< 13531 1726882431.19810: stderr chunk (state=3): >>><<< 13531 1726882431.20172: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882431.20176: handler run complete 13531 1726882431.20178: attempt loop complete, returning result 13531 1726882431.20180: _execute() done 13531 1726882431.20182: dumping result to json 13531 1726882431.20184: done dumping result, returning 13531 1726882431.20186: done running TaskExecutor() for managed_node2/TASK: Stat profile file [0e448fcc-3ce9-4fd9-519d-0000000005e5] 13531 1726882431.20188: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000005e5 13531 1726882431.20258: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000005e5 13531 1726882431.20261: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 13531 1726882431.20326: no more pending results, returning what we have 13531 1726882431.20329: results queue empty 13531 1726882431.20330: checking for any_errors_fatal 13531 1726882431.20336: done checking for any_errors_fatal 13531 1726882431.20337: checking for max_fail_percentage 13531 1726882431.20339: done checking for max_fail_percentage 13531 1726882431.20340: checking to see if all hosts have failed and the running result is not ok 13531 1726882431.20340: done checking to see if all hosts have failed 13531 1726882431.20341: getting the remaining hosts for this loop 13531 
1726882431.20342: done getting the remaining hosts for this loop 13531 1726882431.20345: getting the next task for host managed_node2 13531 1726882431.20351: done getting next task for host managed_node2 13531 1726882431.20353: ^ task is: TASK: Set NM profile exist flag based on the profile files 13531 1726882431.20357: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882431.20360: getting variables 13531 1726882431.20362: in VariableManager get_vars() 13531 1726882431.20415: Calling all_inventory to load vars for managed_node2 13531 1726882431.20417: Calling groups_inventory to load vars for managed_node2 13531 1726882431.20419: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882431.20430: Calling all_plugins_play to load vars for managed_node2 13531 1726882431.20432: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882431.20435: Calling groups_plugins_play to load vars for managed_node2 13531 1726882431.21985: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882431.23738: done with get_vars() 13531 1726882431.23767: done getting variables 13531 1726882431.23832: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:33:51 -0400 (0:00:00.387) 0:00:19.133 ****** 13531 1726882431.23870: entering _queue_task() for managed_node2/set_fact 13531 1726882431.24205: worker is 1 (out of 1 available) 13531 1726882431.24216: exiting _queue_task() for managed_node2/set_fact 13531 1726882431.24228: done queuing things up, now waiting for results queue to drain 13531 1726882431.24229: waiting for pending results... 
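The "Set NM profile exist flag" task queued above is subsequently skipped because its `when:` condition, `profile_stat.stat.exists`, evaluates to `False` against the `stat` result registered earlier. Real Ansible evaluates the condition as a full Jinja2 expression; the dotted-path lookup below is a deliberately simplified sketch of that skip decision (the helper name and the dict-walk approach are assumptions for illustration only):

```python
def evaluate_when(facts: dict, condition: str) -> bool:
    """Illustrative stand-in for Ansible's conditional evaluation:
    walk a dotted attribute path (e.g. 'profile_stat.stat.exists')
    through the registered facts and truth-test the result. Ansible
    itself templates the expression through Jinja2."""
    value = facts
    for part in condition.split("."):
        value = value[part]
    return bool(value)


# Facts as registered by the preceding stat task in the log:
facts = {"profile_stat": {"stat": {"exists": False}}}
# evaluate_when(facts, "profile_stat.stat.exists") -> False,
# so the task is skipped with "Conditional result was False".
```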
13531 1726882431.24517: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files 13531 1726882431.24638: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000005e6 13531 1726882431.24657: variable 'ansible_search_path' from source: unknown 13531 1726882431.24667: variable 'ansible_search_path' from source: unknown 13531 1726882431.24712: calling self._execute() 13531 1726882431.24814: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882431.24825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882431.24838: variable 'omit' from source: magic vars 13531 1726882431.25238: variable 'ansible_distribution_major_version' from source: facts 13531 1726882431.25255: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882431.25389: variable 'profile_stat' from source: set_fact 13531 1726882431.25407: Evaluated conditional (profile_stat.stat.exists): False 13531 1726882431.25413: when evaluation is False, skipping this task 13531 1726882431.25419: _execute() done 13531 1726882431.25424: dumping result to json 13531 1726882431.25436: done dumping result, returning 13531 1726882431.25448: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files [0e448fcc-3ce9-4fd9-519d-0000000005e6] 13531 1726882431.25460: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000005e6 skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13531 1726882431.25609: no more pending results, returning what we have 13531 1726882431.25613: results queue empty 13531 1726882431.25614: checking for any_errors_fatal 13531 1726882431.25624: done checking for any_errors_fatal 13531 1726882431.25625: checking for max_fail_percentage 13531 1726882431.25627: done checking for max_fail_percentage 13531 1726882431.25628: checking to see if all 
hosts have failed and the running result is not ok 13531 1726882431.25629: done checking to see if all hosts have failed 13531 1726882431.25630: getting the remaining hosts for this loop 13531 1726882431.25631: done getting the remaining hosts for this loop 13531 1726882431.25635: getting the next task for host managed_node2 13531 1726882431.25642: done getting next task for host managed_node2 13531 1726882431.25645: ^ task is: TASK: Get NM profile info 13531 1726882431.25650: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882431.25654: getting variables 13531 1726882431.25657: in VariableManager get_vars() 13531 1726882431.25719: Calling all_inventory to load vars for managed_node2 13531 1726882431.25722: Calling groups_inventory to load vars for managed_node2 13531 1726882431.25724: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882431.25739: Calling all_plugins_play to load vars for managed_node2 13531 1726882431.25742: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882431.25745: Calling groups_plugins_play to load vars for managed_node2 13531 1726882431.26703: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000005e6 13531 1726882431.26707: WORKER PROCESS EXITING 13531 1726882431.27478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882431.29253: done with get_vars() 13531 1726882431.29288: done getting variables 13531 1726882431.29358: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:33:51 -0400 (0:00:00.055) 0:00:19.189 ****** 13531 1726882431.29393: entering _queue_task() for managed_node2/shell 13531 1726882431.29748: worker is 1 (out of 1 available) 13531 1726882431.29759: exiting _queue_task() for managed_node2/shell 13531 1726882431.29780: done queuing things up, now waiting for results queue to drain 13531 1726882431.29782: waiting for pending results... 
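The `_low_level_execute_command()` entries throughout this log show that the connection plugin wraps every remote command in `/bin/sh -c '<cmd> && sleep 0'` (for example `echo ~` to discover the remote home directory, and the `mkdir`/`rm` calls that manage the remote tmpdir). A minimal local sketch of that wrapping pattern, assuming a plain `subprocess` call in place of Ansible's SSH connection plugin:

```python
import subprocess


def low_level_execute(cmd: str) -> tuple[int, str, str]:
    """Sketch of the command pattern seen in the log: wrap the command
    in /bin/sh -c '<cmd> && sleep 0' and collect rc/stdout/stderr.
    Ansible runs this over its connection plugin; here it runs locally."""
    wrapped = f"{cmd} && sleep 0"
    proc = subprocess.run(
        ["/bin/sh", "-c", wrapped], capture_output=True, text=True
    )
    return proc.returncode, proc.stdout, proc.stderr


# e.g. the home-directory probe that appears later in this log:
rc, out, err = low_level_execute("echo ~")
```

The trailing `&& sleep 0` ensures the shell's exit status reflects the wrapped command while giving the remote side a final no-op to flush output through.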
13531 1726882431.30083: running TaskExecutor() for managed_node2/TASK: Get NM profile info 13531 1726882431.30203: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000005e7 13531 1726882431.30228: variable 'ansible_search_path' from source: unknown 13531 1726882431.30234: variable 'ansible_search_path' from source: unknown 13531 1726882431.30276: calling self._execute() 13531 1726882431.30376: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882431.30386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882431.30397: variable 'omit' from source: magic vars 13531 1726882431.30784: variable 'ansible_distribution_major_version' from source: facts 13531 1726882431.30801: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882431.30810: variable 'omit' from source: magic vars 13531 1726882431.30860: variable 'omit' from source: magic vars 13531 1726882431.30965: variable 'profile' from source: include params 13531 1726882431.30976: variable 'item' from source: include params 13531 1726882431.31042: variable 'item' from source: include params 13531 1726882431.31067: variable 'omit' from source: magic vars 13531 1726882431.31120: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882431.31159: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882431.31187: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882431.31212: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882431.31228: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882431.31262: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 
1726882431.31273: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882431.31281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882431.31390: Set connection var ansible_pipelining to False 13531 1726882431.31402: Set connection var ansible_timeout to 10 13531 1726882431.31416: Set connection var ansible_shell_executable to /bin/sh 13531 1726882431.31427: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882431.31434: Set connection var ansible_connection to ssh 13531 1726882431.31441: Set connection var ansible_shell_type to sh 13531 1726882431.31475: variable 'ansible_shell_executable' from source: unknown 13531 1726882431.31483: variable 'ansible_connection' from source: unknown 13531 1726882431.31490: variable 'ansible_module_compression' from source: unknown 13531 1726882431.31496: variable 'ansible_shell_type' from source: unknown 13531 1726882431.31502: variable 'ansible_shell_executable' from source: unknown 13531 1726882431.31508: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882431.31515: variable 'ansible_pipelining' from source: unknown 13531 1726882431.31526: variable 'ansible_timeout' from source: unknown 13531 1726882431.31534: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882431.31679: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882431.31696: variable 'omit' from source: magic vars 13531 1726882431.31706: starting attempt loop 13531 1726882431.31713: running the handler 13531 1726882431.31727: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882431.31756: _low_level_execute_command(): starting 13531 1726882431.31771: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13531 1726882431.32535: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882431.32550: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882431.32570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882431.32590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882431.32637: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882431.32649: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882431.32665: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882431.32684: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882431.32696: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882431.32706: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882431.32721: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882431.32735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882431.32750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882431.32762: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882431.32776: stderr chunk (state=3): >>>debug2: match found <<< 13531 
1726882431.32790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882431.32871: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882431.32897: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882431.32915: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882431.33057: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882431.34730: stdout chunk (state=3): >>>/root <<< 13531 1726882431.34833: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882431.34920: stderr chunk (state=3): >>><<< 13531 1726882431.34931: stdout chunk (state=3): >>><<< 13531 1726882431.35055: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received 
exit status from master 0 13531 1726882431.35069: _low_level_execute_command(): starting 13531 1726882431.35073: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882431.3496106-14398-193692478427469 `" && echo ansible-tmp-1726882431.3496106-14398-193692478427469="` echo /root/.ansible/tmp/ansible-tmp-1726882431.3496106-14398-193692478427469 `" ) && sleep 0' 13531 1726882431.35722: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882431.35726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882431.35774: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 13531 1726882431.35777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882431.35779: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882431.35781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882431.35784: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882431.35849: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882431.35852: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882431.35854: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882431.35972: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882431.37858: stdout chunk (state=3): >>>ansible-tmp-1726882431.3496106-14398-193692478427469=/root/.ansible/tmp/ansible-tmp-1726882431.3496106-14398-193692478427469 <<< 13531 1726882431.37964: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882431.38062: stderr chunk (state=3): >>><<< 13531 1726882431.38081: stdout chunk (state=3): >>><<< 13531 1726882431.38339: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882431.3496106-14398-193692478427469=/root/.ansible/tmp/ansible-tmp-1726882431.3496106-14398-193692478427469 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882431.38342: variable 'ansible_module_compression' from source: unknown 13531 
1726882431.38344: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13531i_snf1g8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13531 1726882431.38346: variable 'ansible_facts' from source: unknown 13531 1726882431.38348: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882431.3496106-14398-193692478427469/AnsiballZ_command.py 13531 1726882431.38532: Sending initial data 13531 1726882431.38535: Sent initial data (156 bytes) 13531 1726882431.39846: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882431.39871: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882431.39894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882431.39913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882431.39961: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882431.39980: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882431.40002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882431.40020: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882431.40032: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882431.40044: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882431.40060: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882431.40078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882431.40105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882431.40118: stderr chunk (state=3): >>>debug2: checking match for 'final 
all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882431.40130: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882431.40144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882431.40235: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882431.40263: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882431.40284: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882431.40420: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882431.42199: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13531 1726882431.42285: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13531 1726882431.42388: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13531i_snf1g8/tmpuk8r4anr /root/.ansible/tmp/ansible-tmp-1726882431.3496106-14398-193692478427469/AnsiballZ_command.py <<< 13531 1726882431.42484: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13531 1726882431.43810: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882431.43963: stderr chunk (state=3): >>><<< 13531 1726882431.43968: stdout chunk 
(state=3): >>><<< 13531 1726882431.43991: done transferring module to remote 13531 1726882431.44002: _low_level_execute_command(): starting 13531 1726882431.44009: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882431.3496106-14398-193692478427469/ /root/.ansible/tmp/ansible-tmp-1726882431.3496106-14398-193692478427469/AnsiballZ_command.py && sleep 0' 13531 1726882431.44630: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882431.44638: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882431.44649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882431.44665: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882431.44705: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882431.44712: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882431.44722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882431.44736: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882431.44744: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882431.44750: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882431.44757: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882431.44769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882431.44781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882431.44788: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 
1726882431.44794: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882431.44804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882431.44872: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882431.44890: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882431.44900: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882431.45023: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882431.46877: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882431.46881: stdout chunk (state=3): >>><<< 13531 1726882431.46887: stderr chunk (state=3): >>><<< 13531 1726882431.46903: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 13531 1726882431.46905: _low_level_execute_command(): starting 13531 1726882431.46911: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882431.3496106-14398-193692478427469/AnsiballZ_command.py && sleep 0' 13531 1726882431.47535: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882431.47542: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882431.47551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882431.47572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882431.47610: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882431.47620: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882431.47623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882431.47637: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882431.47644: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882431.47650: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882431.47660: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882431.47671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882431.47682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882431.47689: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882431.47695: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882431.47704: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882431.47779: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882431.47795: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882431.47805: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882431.47934: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882431.63254: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0 /etc/NetworkManager/system-connections/bond0.nmconnection \nbond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-20 21:33:51.607761", "end": "2024-09-20 21:33:51.630531", "delta": "0:00:00.022770", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13531 1726882431.64489: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 13531 1726882431.64560: stderr chunk (state=3): >>><<< 13531 1726882431.64566: stdout chunk (state=3): >>><<< 13531 1726882431.64706: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0 /etc/NetworkManager/system-connections/bond0.nmconnection \nbond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-20 21:33:51.607761", "end": "2024-09-20 21:33:51.630531", "delta": "0:00:00.022770", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 13531 1726882431.64716: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882431.3496106-14398-193692478427469/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13531 1726882431.64718: _low_level_execute_command(): starting 13531 1726882431.64721: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882431.3496106-14398-193692478427469/ > /dev/null 2>&1 && sleep 0' 13531 1726882431.65332: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882431.65347: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882431.65363: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882431.65393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882431.65436: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882431.65449: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882431.65468: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882431.65486: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882431.65507: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882431.65520: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882431.65533: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882431.65547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882431.65565: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882431.65579: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882431.65591: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882431.65608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882431.65693: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882431.65718: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882431.65742: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882431.65875: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882431.68281: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882431.68356: stderr chunk (state=3): >>><<< 13531 1726882431.68359: stdout chunk (state=3): >>><<< 13531 1726882431.68471: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882431.68474: handler run complete 13531 1726882431.68477: Evaluated conditional (False): False 13531 1726882431.68479: attempt loop complete, returning result 13531 1726882431.68481: _execute() done 13531 1726882431.68482: dumping result to json 13531 1726882431.68484: done dumping result, returning 13531 1726882431.68486: done running TaskExecutor() for managed_node2/TASK: Get NM profile info [0e448fcc-3ce9-4fd9-519d-0000000005e7] 13531 1726882431.68488: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000005e7 13531 1726882431.68654: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000005e7 13531 1726882431.68656: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "delta": "0:00:00.022770", "end": "2024-09-20 21:33:51.630531", "rc": 0, "start": "2024-09-20 21:33:51.607761" } STDOUT: bond0 /etc/NetworkManager/system-connections/bond0.nmconnection bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection 
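The "Get NM profile info" result above comes from the shell pipeline `nmcli -f NAME,FILENAME connection show | grep bond0 | grep /etc`. A minimal Python sketch of the same filtering, using the stdout captured in the log (plus one hypothetical non-/etc line, `virbr0`, added only to show a line being rejected — it is not in the log):

```python
# Sample output in the two-column NAME/FILENAME format shown in the log.
# The virbr0 line is a hypothetical addition to exercise the /etc filter.
sample_stdout = """\
bond0    /etc/NetworkManager/system-connections/bond0.nmconnection
bond0.0  /etc/NetworkManager/system-connections/bond0.0.nmconnection
bond0.1  /etc/NetworkManager/system-connections/bond0.1.nmconnection
virbr0   /run/NetworkManager/system-connections/virbr0.nmconnection
"""

def matching_profiles(text, name="bond0", prefix="/etc"):
    """Return (name, filename) pairs from nmcli-style output, mirroring
    the two substring greps in the task's shell pipeline."""
    pairs = []
    for line in text.splitlines():
        # grep semantics: plain substring match on the whole line,
        # so 'bond0' also matches bond0.0 and bond0.1.
        if name in line and prefix in line:
            cols = line.split()
            pairs.append((cols[0], cols[1]))
    return pairs

print(matching_profiles(sample_stdout))
```

As in the task result, three profiles survive the filter; the `/run`-backed connection is dropped because it does not match `/etc`.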
13531 1726882431.68759: no more pending results, returning what we have 13531 1726882431.68967: results queue empty 13531 1726882431.68968: checking for any_errors_fatal 13531 1726882431.68975: done checking for any_errors_fatal 13531 1726882431.68976: checking for max_fail_percentage 13531 1726882431.68978: done checking for max_fail_percentage 13531 1726882431.68979: checking to see if all hosts have failed and the running result is not ok 13531 1726882431.68980: done checking to see if all hosts have failed 13531 1726882431.68980: getting the remaining hosts for this loop 13531 1726882431.68982: done getting the remaining hosts for this loop 13531 1726882431.68985: getting the next task for host managed_node2 13531 1726882431.68992: done getting next task for host managed_node2 13531 1726882431.68994: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 13531 1726882431.68999: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882431.69002: getting variables 13531 1726882431.69004: in VariableManager get_vars() 13531 1726882431.69062: Calling all_inventory to load vars for managed_node2 13531 1726882431.69068: Calling groups_inventory to load vars for managed_node2 13531 1726882431.69070: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882431.69087: Calling all_plugins_play to load vars for managed_node2 13531 1726882431.69089: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882431.69091: Calling groups_plugins_play to load vars for managed_node2 13531 1726882431.70815: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882431.73775: done with get_vars() 13531 1726882431.73808: done getting variables 13531 1726882431.73878: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:33:51 -0400 (0:00:00.445) 0:00:19.634 ****** 13531 1726882431.73910: entering _queue_task() for managed_node2/set_fact 13531 1726882431.74713: worker is 1 (out of 1 available) 13531 1726882431.74725: exiting _queue_task() for managed_node2/set_fact 13531 1726882431.74739: done queuing things up, now waiting for results queue to drain 13531 1726882431.74740: waiting for pending results... 
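The `set_fact` task queued here is gated on the registered nmcli result: as the conditional evaluations in the log show, the flags are set only when `nm_profile_exists.rc == 0`. A minimal sketch of that decision, under the assumption that the registered result is a plain dict (the function `profile_flags` is a hypothetical stand-in, not Ansible's implementation):

```python
def profile_flags(nm_profile_exists):
    """Return the facts the set_fact task would define, or an empty dict
    when its when-condition (rc == 0) fails and the task is skipped."""
    if nm_profile_exists.get("rc") != 0:
        return {}  # conditional False -> task skipped, no facts set
    # Flag names copied from the task result shown in the log.
    return {
        "lsr_net_profile_exists": True,
        "lsr_net_profile_ansible_managed": True,
        "lsr_net_profile_fingerprint": True,
    }

print(profile_flags({"rc": 0}))
```

With the `rc=0` result registered by the previous task, all three flags come back true, matching the `ok: [managed_node2]` result recorded below.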
13531 1726882431.75308: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 13531 1726882431.75427: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000005e8 13531 1726882431.75447: variable 'ansible_search_path' from source: unknown 13531 1726882431.75454: variable 'ansible_search_path' from source: unknown 13531 1726882431.75503: calling self._execute() 13531 1726882431.75607: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882431.75618: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882431.75631: variable 'omit' from source: magic vars 13531 1726882431.76006: variable 'ansible_distribution_major_version' from source: facts 13531 1726882431.76031: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882431.76172: variable 'nm_profile_exists' from source: set_fact 13531 1726882431.76192: Evaluated conditional (nm_profile_exists.rc == 0): True 13531 1726882431.76201: variable 'omit' from source: magic vars 13531 1726882431.77117: variable 'omit' from source: magic vars 13531 1726882431.77151: variable 'omit' from source: magic vars 13531 1726882431.77261: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882431.77361: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882431.77390: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882431.77454: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882431.77521: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882431.77565: variable 'inventory_hostname' from source: host vars for 'managed_node2' 
13531 1726882431.77590: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882431.77644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882431.77753: Set connection var ansible_pipelining to False 13531 1726882431.77992: Set connection var ansible_timeout to 10 13531 1726882431.78003: Set connection var ansible_shell_executable to /bin/sh 13531 1726882431.78012: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882431.78096: Set connection var ansible_connection to ssh 13531 1726882431.78102: Set connection var ansible_shell_type to sh 13531 1726882431.78132: variable 'ansible_shell_executable' from source: unknown 13531 1726882431.78170: variable 'ansible_connection' from source: unknown 13531 1726882431.78177: variable 'ansible_module_compression' from source: unknown 13531 1726882431.78184: variable 'ansible_shell_type' from source: unknown 13531 1726882431.78189: variable 'ansible_shell_executable' from source: unknown 13531 1726882431.78202: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882431.78209: variable 'ansible_pipelining' from source: unknown 13531 1726882431.78215: variable 'ansible_timeout' from source: unknown 13531 1726882431.78222: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882431.79286: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882431.79544: variable 'omit' from source: magic vars 13531 1726882431.79646: starting attempt loop 13531 1726882431.79654: running the handler 13531 1726882431.79677: handler run complete 13531 1726882431.79692: attempt loop complete, returning result 13531 1726882431.79754: _execute() done 
13531 1726882431.79763: dumping result to json 13531 1726882431.79773: done dumping result, returning 13531 1726882431.79787: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0e448fcc-3ce9-4fd9-519d-0000000005e8] 13531 1726882431.79798: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000005e8 ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 13531 1726882431.80021: no more pending results, returning what we have 13531 1726882431.80025: results queue empty 13531 1726882431.80026: checking for any_errors_fatal 13531 1726882431.80037: done checking for any_errors_fatal 13531 1726882431.80038: checking for max_fail_percentage 13531 1726882431.80041: done checking for max_fail_percentage 13531 1726882431.80042: checking to see if all hosts have failed and the running result is not ok 13531 1726882431.80043: done checking to see if all hosts have failed 13531 1726882431.80044: getting the remaining hosts for this loop 13531 1726882431.80045: done getting the remaining hosts for this loop 13531 1726882431.80048: getting the next task for host managed_node2 13531 1726882431.80058: done getting next task for host managed_node2 13531 1726882431.80061: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 13531 1726882431.80069: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882431.80075: getting variables 13531 1726882431.80077: in VariableManager get_vars() 13531 1726882431.80138: Calling all_inventory to load vars for managed_node2 13531 1726882431.80141: Calling groups_inventory to load vars for managed_node2 13531 1726882431.80144: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882431.80156: Calling all_plugins_play to load vars for managed_node2 13531 1726882431.80159: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882431.80163: Calling groups_plugins_play to load vars for managed_node2 13531 1726882431.81477: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000005e8 13531 1726882431.81481: WORKER PROCESS EXITING 13531 1726882431.82296: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882431.84094: done with get_vars() 13531 1726882431.84116: done getting variables 13531 1726882431.84181: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13531 1726882431.84301: variable 'profile' from source: include params 13531 1726882431.84305: variable 'item' from source: include params 13531 1726882431.84362: variable 'item' from source: include params TASK [Get the 
ansible_managed comment in ifcfg-bond0] ************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:33:51 -0400 (0:00:00.104) 0:00:19.739 ****** 13531 1726882431.84399: entering _queue_task() for managed_node2/command 13531 1726882431.85053: worker is 1 (out of 1 available) 13531 1726882431.85066: exiting _queue_task() for managed_node2/command 13531 1726882431.85077: done queuing things up, now waiting for results queue to drain 13531 1726882431.85078: waiting for pending results... 13531 1726882431.85623: running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-bond0 13531 1726882431.86613: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000005ea 13531 1726882431.86632: variable 'ansible_search_path' from source: unknown 13531 1726882431.86639: variable 'ansible_search_path' from source: unknown 13531 1726882431.86686: calling self._execute() 13531 1726882431.86782: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882431.86877: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882431.86890: variable 'omit' from source: magic vars 13531 1726882431.87916: variable 'ansible_distribution_major_version' from source: facts 13531 1726882431.87934: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882431.88067: variable 'profile_stat' from source: set_fact 13531 1726882431.88084: Evaluated conditional (profile_stat.stat.exists): False 13531 1726882431.88091: when evaluation is False, skipping this task 13531 1726882431.88097: _execute() done 13531 1726882431.88104: dumping result to json 13531 1726882431.88114: done dumping result, returning 13531 1726882431.88124: done running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-bond0 [0e448fcc-3ce9-4fd9-519d-0000000005ea] 13531 1726882431.88135: 
sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000005ea skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13531 1726882431.88288: no more pending results, returning what we have 13531 1726882431.88292: results queue empty 13531 1726882431.88293: checking for any_errors_fatal 13531 1726882431.88301: done checking for any_errors_fatal 13531 1726882431.88302: checking for max_fail_percentage 13531 1726882431.88303: done checking for max_fail_percentage 13531 1726882431.88304: checking to see if all hosts have failed and the running result is not ok 13531 1726882431.88305: done checking to see if all hosts have failed 13531 1726882431.88306: getting the remaining hosts for this loop 13531 1726882431.88307: done getting the remaining hosts for this loop 13531 1726882431.88310: getting the next task for host managed_node2 13531 1726882431.88317: done getting next task for host managed_node2 13531 1726882431.88320: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 13531 1726882431.88324: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882431.88328: getting variables 13531 1726882431.88329: in VariableManager get_vars() 13531 1726882431.88387: Calling all_inventory to load vars for managed_node2 13531 1726882431.88390: Calling groups_inventory to load vars for managed_node2 13531 1726882431.88392: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882431.88405: Calling all_plugins_play to load vars for managed_node2 13531 1726882431.88408: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882431.88410: Calling groups_plugins_play to load vars for managed_node2 13531 1726882431.89596: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000005ea 13531 1726882431.89600: WORKER PROCESS EXITING 13531 1726882431.90212: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882431.91955: done with get_vars() 13531 1726882431.91985: done getting variables 13531 1726882431.92050: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13531 1726882431.92173: variable 'profile' from source: include params 13531 1726882431.92177: variable 'item' from source: include params 13531 1726882431.92241: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0] *********************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:33:51 -0400 (0:00:00.078) 0:00:19.817 ****** 13531 1726882431.92277: entering _queue_task() for managed_node2/set_fact 13531 1726882431.92630: worker is 1 (out of 1 available) 13531 1726882431.92641: exiting _queue_task() for managed_node2/set_fact 13531 
1726882431.92660: done queuing things up, now waiting for results queue to drain 13531 1726882431.92662: waiting for pending results... 13531 1726882431.92960: running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-bond0 13531 1726882431.93105: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000005eb 13531 1726882431.93127: variable 'ansible_search_path' from source: unknown 13531 1726882431.93135: variable 'ansible_search_path' from source: unknown 13531 1726882431.93180: calling self._execute() 13531 1726882431.93284: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882431.93296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882431.93316: variable 'omit' from source: magic vars 13531 1726882431.93708: variable 'ansible_distribution_major_version' from source: facts 13531 1726882431.93727: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882431.93869: variable 'profile_stat' from source: set_fact 13531 1726882431.93888: Evaluated conditional (profile_stat.stat.exists): False 13531 1726882431.93895: when evaluation is False, skipping this task 13531 1726882431.93903: _execute() done 13531 1726882431.93909: dumping result to json 13531 1726882431.93916: done dumping result, returning 13531 1726882431.93925: done running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-bond0 [0e448fcc-3ce9-4fd9-519d-0000000005eb] 13531 1726882431.93937: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000005eb 13531 1726882431.94050: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000005eb 13531 1726882431.94058: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13531 1726882431.94125: no more pending results, returning what we have 13531 1726882431.94129: results queue empty 13531 
1726882431.94130: checking for any_errors_fatal 13531 1726882431.94136: done checking for any_errors_fatal 13531 1726882431.94137: checking for max_fail_percentage 13531 1726882431.94139: done checking for max_fail_percentage 13531 1726882431.94140: checking to see if all hosts have failed and the running result is not ok 13531 1726882431.94141: done checking to see if all hosts have failed 13531 1726882431.94142: getting the remaining hosts for this loop 13531 1726882431.94143: done getting the remaining hosts for this loop 13531 1726882431.94147: getting the next task for host managed_node2 13531 1726882431.94154: done getting next task for host managed_node2 13531 1726882431.94158: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 13531 1726882431.94165: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882431.94170: getting variables 13531 1726882431.94172: in VariableManager get_vars() 13531 1726882431.94237: Calling all_inventory to load vars for managed_node2 13531 1726882431.94241: Calling groups_inventory to load vars for managed_node2 13531 1726882431.94244: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882431.94258: Calling all_plugins_play to load vars for managed_node2 13531 1726882431.94262: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882431.94267: Calling groups_plugins_play to load vars for managed_node2 13531 1726882431.96116: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882431.98361: done with get_vars() 13531 1726882431.99194: done getting variables 13531 1726882431.99255: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13531 1726882431.99368: variable 'profile' from source: include params 13531 1726882431.99372: variable 'item' from source: include params 13531 1726882431.99432: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0] ****************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:33:51 -0400 (0:00:00.071) 0:00:19.889 ****** 13531 1726882431.99463: entering _queue_task() for managed_node2/command 13531 1726882431.99971: worker is 1 (out of 1 available) 13531 1726882431.99982: exiting _queue_task() for managed_node2/command 13531 1726882431.99994: done queuing things up, now waiting for results queue to drain 13531 1726882431.99996: waiting for pending results... 
13531 1726882432.00270: running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-bond0 13531 1726882432.00405: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000005ec 13531 1726882432.00426: variable 'ansible_search_path' from source: unknown 13531 1726882432.00438: variable 'ansible_search_path' from source: unknown 13531 1726882432.00480: calling self._execute() 13531 1726882432.00576: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882432.00587: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882432.00601: variable 'omit' from source: magic vars 13531 1726882432.01157: variable 'ansible_distribution_major_version' from source: facts 13531 1726882432.01321: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882432.01474: variable 'profile_stat' from source: set_fact 13531 1726882432.01538: Evaluated conditional (profile_stat.stat.exists): False 13531 1726882432.01555: when evaluation is False, skipping this task 13531 1726882432.01639: _execute() done 13531 1726882432.01646: dumping result to json 13531 1726882432.01654: done dumping result, returning 13531 1726882432.01666: done running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-bond0 [0e448fcc-3ce9-4fd9-519d-0000000005ec] 13531 1726882432.01678: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000005ec skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13531 1726882432.01824: no more pending results, returning what we have 13531 1726882432.01829: results queue empty 13531 1726882432.01831: checking for any_errors_fatal 13531 1726882432.01837: done checking for any_errors_fatal 13531 1726882432.01838: checking for max_fail_percentage 13531 1726882432.01840: done checking for max_fail_percentage 13531 1726882432.01841: checking to see if all hosts have failed 
and the running result is not ok 13531 1726882432.01842: done checking to see if all hosts have failed 13531 1726882432.01843: getting the remaining hosts for this loop 13531 1726882432.01844: done getting the remaining hosts for this loop 13531 1726882432.01848: getting the next task for host managed_node2 13531 1726882432.01856: done getting next task for host managed_node2 13531 1726882432.01859: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 13531 1726882432.01867: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882432.01872: getting variables 13531 1726882432.01874: in VariableManager get_vars() 13531 1726882432.01939: Calling all_inventory to load vars for managed_node2 13531 1726882432.01943: Calling groups_inventory to load vars for managed_node2 13531 1726882432.01946: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882432.01960: Calling all_plugins_play to load vars for managed_node2 13531 1726882432.01965: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882432.01968: Calling groups_plugins_play to load vars for managed_node2 13531 1726882432.02983: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000005ec 13531 1726882432.02986: WORKER PROCESS EXITING 13531 1726882432.04577: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882432.07341: done with get_vars() 13531 1726882432.07378: done getting variables 13531 1726882432.07440: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13531 1726882432.07562: variable 'profile' from source: include params 13531 1726882432.08534: variable 'item' from source: include params 13531 1726882432.08604: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:33:52 -0400 (0:00:00.091) 0:00:19.981 ****** 13531 1726882432.08638: entering _queue_task() for managed_node2/set_fact 13531 1726882432.08956: worker is 1 (out of 1 available) 13531 1726882432.08971: exiting _queue_task() for managed_node2/set_fact 13531 
1726882432.08985: done queuing things up, now waiting for results queue to drain 13531 1726882432.08986: waiting for pending results... 13531 1726882432.09954: running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-bond0 13531 1726882432.10193: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000005ed 13531 1726882432.10322: variable 'ansible_search_path' from source: unknown 13531 1726882432.10350: variable 'ansible_search_path' from source: unknown 13531 1726882432.10409: calling self._execute() 13531 1726882432.10655: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882432.10790: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882432.10804: variable 'omit' from source: magic vars 13531 1726882432.11338: variable 'ansible_distribution_major_version' from source: facts 13531 1726882432.11357: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882432.11487: variable 'profile_stat' from source: set_fact 13531 1726882432.11506: Evaluated conditional (profile_stat.stat.exists): False 13531 1726882432.11514: when evaluation is False, skipping this task 13531 1726882432.11522: _execute() done 13531 1726882432.11527: dumping result to json 13531 1726882432.11535: done dumping result, returning 13531 1726882432.11547: done running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-bond0 [0e448fcc-3ce9-4fd9-519d-0000000005ed] 13531 1726882432.11558: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000005ed skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13531 1726882432.11701: no more pending results, returning what we have 13531 1726882432.11705: results queue empty 13531 1726882432.11706: checking for any_errors_fatal 13531 1726882432.11714: done checking for any_errors_fatal 13531 1726882432.11715: checking for 
max_fail_percentage 13531 1726882432.11717: done checking for max_fail_percentage 13531 1726882432.11718: checking to see if all hosts have failed and the running result is not ok 13531 1726882432.11719: done checking to see if all hosts have failed 13531 1726882432.11720: getting the remaining hosts for this loop 13531 1726882432.11721: done getting the remaining hosts for this loop 13531 1726882432.11724: getting the next task for host managed_node2 13531 1726882432.11733: done getting next task for host managed_node2 13531 1726882432.11735: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 13531 1726882432.11739: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882432.11743: getting variables 13531 1726882432.11745: in VariableManager get_vars() 13531 1726882432.11808: Calling all_inventory to load vars for managed_node2 13531 1726882432.11811: Calling groups_inventory to load vars for managed_node2 13531 1726882432.11814: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882432.11827: Calling all_plugins_play to load vars for managed_node2 13531 1726882432.11830: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882432.11833: Calling groups_plugins_play to load vars for managed_node2 13531 1726882432.12854: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000005ed 13531 1726882432.12858: WORKER PROCESS EXITING 13531 1726882432.13645: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882432.15329: done with get_vars() 13531 1726882432.15355: done getting variables 13531 1726882432.15420: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13531 1726882432.15541: variable 'profile' from source: include params 13531 1726882432.15545: variable 'item' from source: include params 13531 1726882432.15602: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0'] **************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 21:33:52 -0400 (0:00:00.069) 0:00:20.051 ****** 13531 1726882432.15633: entering _queue_task() for managed_node2/assert 13531 1726882432.15941: worker is 1 (out of 1 available) 13531 1726882432.15953: exiting _queue_task() for managed_node2/assert 13531 
1726882432.15968: done queuing things up, now waiting for results queue to drain 13531 1726882432.15969: waiting for pending results... 13531 1726882432.16243: running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'bond0' 13531 1726882432.16358: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000356 13531 1726882432.16380: variable 'ansible_search_path' from source: unknown 13531 1726882432.16387: variable 'ansible_search_path' from source: unknown 13531 1726882432.16433: calling self._execute() 13531 1726882432.16533: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882432.16638: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882432.16653: variable 'omit' from source: magic vars 13531 1726882432.17338: variable 'ansible_distribution_major_version' from source: facts 13531 1726882432.17356: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882432.17369: variable 'omit' from source: magic vars 13531 1726882432.17414: variable 'omit' from source: magic vars 13531 1726882432.17587: variable 'profile' from source: include params 13531 1726882432.17721: variable 'item' from source: include params 13531 1726882432.17783: variable 'item' from source: include params 13531 1726882432.17805: variable 'omit' from source: magic vars 13531 1726882432.17866: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882432.17958: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882432.18014: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882432.18035: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882432.18053: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882432.18108: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882432.18117: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882432.18125: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882432.18234: Set connection var ansible_pipelining to False 13531 1726882432.18244: Set connection var ansible_timeout to 10 13531 1726882432.18255: Set connection var ansible_shell_executable to /bin/sh 13531 1726882432.18270: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882432.18276: Set connection var ansible_connection to ssh 13531 1726882432.18291: Set connection var ansible_shell_type to sh 13531 1726882432.18324: variable 'ansible_shell_executable' from source: unknown 13531 1726882432.18331: variable 'ansible_connection' from source: unknown 13531 1726882432.18337: variable 'ansible_module_compression' from source: unknown 13531 1726882432.18342: variable 'ansible_shell_type' from source: unknown 13531 1726882432.18347: variable 'ansible_shell_executable' from source: unknown 13531 1726882432.18353: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882432.18359: variable 'ansible_pipelining' from source: unknown 13531 1726882432.18369: variable 'ansible_timeout' from source: unknown 13531 1726882432.18376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882432.18514: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882432.18531: variable 'omit' from source: magic vars 13531 1726882432.18542: starting 
attempt loop 13531 1726882432.18547: running the handler 13531 1726882432.18661: variable 'lsr_net_profile_exists' from source: set_fact 13531 1726882432.18675: Evaluated conditional (lsr_net_profile_exists): True 13531 1726882432.18685: handler run complete 13531 1726882432.18709: attempt loop complete, returning result 13531 1726882432.18716: _execute() done 13531 1726882432.18723: dumping result to json 13531 1726882432.18731: done dumping result, returning 13531 1726882432.18742: done running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'bond0' [0e448fcc-3ce9-4fd9-519d-000000000356] 13531 1726882432.18752: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000356 ok: [managed_node2] => { "changed": false } MSG: All assertions passed 13531 1726882432.18903: no more pending results, returning what we have 13531 1726882432.18906: results queue empty 13531 1726882432.18907: checking for any_errors_fatal 13531 1726882432.18915: done checking for any_errors_fatal 13531 1726882432.18916: checking for max_fail_percentage 13531 1726882432.18918: done checking for max_fail_percentage 13531 1726882432.18919: checking to see if all hosts have failed and the running result is not ok 13531 1726882432.18920: done checking to see if all hosts have failed 13531 1726882432.18921: getting the remaining hosts for this loop 13531 1726882432.18922: done getting the remaining hosts for this loop 13531 1726882432.18926: getting the next task for host managed_node2 13531 1726882432.18932: done getting next task for host managed_node2 13531 1726882432.18935: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 13531 1726882432.18938: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882432.18943: getting variables 13531 1726882432.18945: in VariableManager get_vars() 13531 1726882432.19007: Calling all_inventory to load vars for managed_node2 13531 1726882432.19010: Calling groups_inventory to load vars for managed_node2 13531 1726882432.19012: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882432.19024: Calling all_plugins_play to load vars for managed_node2 13531 1726882432.19027: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882432.19030: Calling groups_plugins_play to load vars for managed_node2 13531 1726882432.19982: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000356 13531 1726882432.19985: WORKER PROCESS EXITING 13531 1726882432.21728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882432.23347: done with get_vars() 13531 1726882432.23373: done getting variables 13531 1726882432.23431: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13531 1726882432.23543: variable 'profile' from source: include params 13531 1726882432.23546: variable 'item' from source: include params 13531 1726882432.23615: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0'] *********** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 21:33:52 -0400 (0:00:00.080) 0:00:20.131 ****** 13531 1726882432.23742: entering _queue_task() for managed_node2/assert 13531 1726882432.24043: worker is 1 (out of 1 available) 13531 1726882432.24055: exiting _queue_task() for managed_node2/assert 13531 1726882432.24068: done queuing things up, now waiting for results queue to drain 13531 1726882432.24069: waiting for pending results... 13531 1726882432.24347: running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'bond0' 13531 1726882432.24467: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000357 13531 1726882432.24487: variable 'ansible_search_path' from source: unknown 13531 1726882432.24495: variable 'ansible_search_path' from source: unknown 13531 1726882432.24541: calling self._execute() 13531 1726882432.24636: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882432.24648: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882432.24666: variable 'omit' from source: magic vars 13531 1726882432.25026: variable 'ansible_distribution_major_version' from source: facts 13531 1726882432.25043: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882432.25058: variable 'omit' from source: magic vars 13531 1726882432.25102: variable 'omit' from source: magic vars 13531 1726882432.25205: variable 'profile' from source: include params 13531 1726882432.25216: variable 'item' from source: include params 13531 1726882432.25287: variable 'item' from source: include params 13531 1726882432.25310: variable 'omit' from source: magic vars 13531 1726882432.25357: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882432.25402: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882432.25427: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882432.25448: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882432.25466: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882432.25506: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882432.25514: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882432.25521: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882432.25629: Set connection var ansible_pipelining to False 13531 1726882432.25641: Set connection var ansible_timeout to 10 13531 1726882432.25651: Set connection var ansible_shell_executable to /bin/sh 13531 1726882432.25661: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882432.25671: Set connection var ansible_connection to ssh 13531 1726882432.25678: Set connection var ansible_shell_type to sh 13531 1726882432.25713: variable 'ansible_shell_executable' from source: unknown 13531 1726882432.25721: variable 'ansible_connection' from source: unknown 13531 1726882432.25727: variable 'ansible_module_compression' from source: unknown 13531 1726882432.25733: variable 'ansible_shell_type' from source: unknown 13531 1726882432.25739: variable 'ansible_shell_executable' from source: unknown 13531 1726882432.25745: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882432.25752: variable 'ansible_pipelining' from source: unknown 13531 1726882432.25759: variable 'ansible_timeout' from source: unknown 13531 1726882432.25770: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 
1726882432.25913: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882432.25935: variable 'omit' from source: magic vars 13531 1726882432.25946: starting attempt loop 13531 1726882432.25953: running the handler 13531 1726882432.26068: variable 'lsr_net_profile_ansible_managed' from source: set_fact 13531 1726882432.26079: Evaluated conditional (lsr_net_profile_ansible_managed): True 13531 1726882432.26089: handler run complete 13531 1726882432.26108: attempt loop complete, returning result 13531 1726882432.26115: _execute() done 13531 1726882432.26122: dumping result to json 13531 1726882432.26130: done dumping result, returning 13531 1726882432.26145: done running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'bond0' [0e448fcc-3ce9-4fd9-519d-000000000357] 13531 1726882432.26156: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000357 ok: [managed_node2] => { "changed": false } MSG: All assertions passed 13531 1726882432.26302: no more pending results, returning what we have 13531 1726882432.26305: results queue empty 13531 1726882432.26306: checking for any_errors_fatal 13531 1726882432.26315: done checking for any_errors_fatal 13531 1726882432.26316: checking for max_fail_percentage 13531 1726882432.26318: done checking for max_fail_percentage 13531 1726882432.26319: checking to see if all hosts have failed and the running result is not ok 13531 1726882432.26320: done checking to see if all hosts have failed 13531 1726882432.26321: getting the remaining hosts for this loop 13531 1726882432.26322: done getting the remaining hosts for this loop 13531 1726882432.26326: getting the next task for host managed_node2 13531 1726882432.26332: done getting 
next task for host managed_node2 13531 1726882432.26335: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 13531 1726882432.26339: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882432.26343: getting variables 13531 1726882432.26345: in VariableManager get_vars() 13531 1726882432.26408: Calling all_inventory to load vars for managed_node2 13531 1726882432.26411: Calling groups_inventory to load vars for managed_node2 13531 1726882432.26414: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882432.26426: Calling all_plugins_play to load vars for managed_node2 13531 1726882432.26429: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882432.26432: Calling groups_plugins_play to load vars for managed_node2 13531 1726882432.27384: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000357 13531 1726882432.27387: WORKER PROCESS EXITING 13531 1726882432.28169: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882432.34472: done with get_vars() 13531 1726882432.34499: done getting variables 13531 1726882432.34548: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) 13531 1726882432.34644: variable 'profile' from source: include params 13531 1726882432.34647: variable 'item' from source: include params 13531 1726882432.34707: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0] ***************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 21:33:52 -0400 (0:00:00.110) 0:00:20.242 ****** 13531 1726882432.34739: entering _queue_task() for managed_node2/assert 13531 1726882432.35067: worker is 1 (out of 1 available) 13531 1726882432.35078: exiting _queue_task() for managed_node2/assert 13531 1726882432.35091: done queuing things up, now waiting for results queue to drain 13531 1726882432.35093: waiting for pending results... 13531 1726882432.35380: running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in bond0 13531 1726882432.35499: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000358 13531 1726882432.35519: variable 'ansible_search_path' from source: unknown 13531 1726882432.35530: variable 'ansible_search_path' from source: unknown 13531 1726882432.35577: calling self._execute() 13531 1726882432.35681: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882432.35694: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882432.35709: variable 'omit' from source: magic vars 13531 1726882432.36098: variable 'ansible_distribution_major_version' from source: facts 13531 1726882432.36116: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882432.36127: variable 'omit' from source: magic vars 13531 1726882432.36174: variable 'omit' from source: magic vars 13531 1726882432.36277: variable 'profile' from source: include params 13531 1726882432.36288: variable 'item' from source: include params 13531 
1726882432.36357: variable 'item' from source: include params 13531 1726882432.36384: variable 'omit' from source: magic vars 13531 1726882432.36436: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882432.36477: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882432.36503: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882432.36529: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882432.36546: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882432.36583: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882432.36592: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882432.36600: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882432.36707: Set connection var ansible_pipelining to False 13531 1726882432.36718: Set connection var ansible_timeout to 10 13531 1726882432.36732: Set connection var ansible_shell_executable to /bin/sh 13531 1726882432.36743: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882432.36750: Set connection var ansible_connection to ssh 13531 1726882432.36758: Set connection var ansible_shell_type to sh 13531 1726882432.36791: variable 'ansible_shell_executable' from source: unknown 13531 1726882432.36799: variable 'ansible_connection' from source: unknown 13531 1726882432.36807: variable 'ansible_module_compression' from source: unknown 13531 1726882432.36813: variable 'ansible_shell_type' from source: unknown 13531 1726882432.36820: variable 'ansible_shell_executable' from source: unknown 13531 1726882432.36826: variable 'ansible_host' from source: host 
vars for 'managed_node2' 13531 1726882432.36837: variable 'ansible_pipelining' from source: unknown 13531 1726882432.36844: variable 'ansible_timeout' from source: unknown 13531 1726882432.36852: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882432.36991: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882432.37009: variable 'omit' from source: magic vars 13531 1726882432.37018: starting attempt loop 13531 1726882432.37024: running the handler 13531 1726882432.37137: variable 'lsr_net_profile_fingerprint' from source: set_fact 13531 1726882432.37147: Evaluated conditional (lsr_net_profile_fingerprint): True 13531 1726882432.37162: handler run complete 13531 1726882432.37184: attempt loop complete, returning result 13531 1726882432.37191: _execute() done 13531 1726882432.37198: dumping result to json 13531 1726882432.37205: done dumping result, returning 13531 1726882432.37216: done running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in bond0 [0e448fcc-3ce9-4fd9-519d-000000000358] 13531 1726882432.37228: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000358 ok: [managed_node2] => { "changed": false } MSG: All assertions passed 13531 1726882432.37374: no more pending results, returning what we have 13531 1726882432.37377: results queue empty 13531 1726882432.37378: checking for any_errors_fatal 13531 1726882432.37386: done checking for any_errors_fatal 13531 1726882432.37386: checking for max_fail_percentage 13531 1726882432.37388: done checking for max_fail_percentage 13531 1726882432.37389: checking to see if all hosts have failed and the running result is not ok 13531 1726882432.37390: done checking to see if 
all hosts have failed 13531 1726882432.37391: getting the remaining hosts for this loop 13531 1726882432.37392: done getting the remaining hosts for this loop 13531 1726882432.37396: getting the next task for host managed_node2 13531 1726882432.37406: done getting next task for host managed_node2 13531 1726882432.37410: ^ task is: TASK: Include the task 'get_profile_stat.yml' 13531 1726882432.37413: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882432.37417: getting variables 13531 1726882432.37419: in VariableManager get_vars() 13531 1726882432.37482: Calling all_inventory to load vars for managed_node2 13531 1726882432.37486: Calling groups_inventory to load vars for managed_node2 13531 1726882432.37489: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882432.37500: Calling all_plugins_play to load vars for managed_node2 13531 1726882432.37504: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882432.37507: Calling groups_plugins_play to load vars for managed_node2 13531 1726882432.38487: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000358 13531 1726882432.38492: WORKER PROCESS EXITING 13531 1726882432.39239: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882432.40890: done with get_vars() 13531 1726882432.40916: done getting variables TASK [Include the task 'get_profile_stat.yml'] 
********************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 21:33:52 -0400 (0:00:00.062) 0:00:20.305 ****** 13531 1726882432.41014: entering _queue_task() for managed_node2/include_tasks 13531 1726882432.41326: worker is 1 (out of 1 available) 13531 1726882432.41340: exiting _queue_task() for managed_node2/include_tasks 13531 1726882432.41352: done queuing things up, now waiting for results queue to drain 13531 1726882432.41354: waiting for pending results... 13531 1726882432.41639: running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' 13531 1726882432.41754: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000035c 13531 1726882432.41777: variable 'ansible_search_path' from source: unknown 13531 1726882432.41784: variable 'ansible_search_path' from source: unknown 13531 1726882432.41829: calling self._execute() 13531 1726882432.41934: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882432.41945: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882432.41958: variable 'omit' from source: magic vars 13531 1726882432.42338: variable 'ansible_distribution_major_version' from source: facts 13531 1726882432.42359: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882432.42374: _execute() done 13531 1726882432.42382: dumping result to json 13531 1726882432.42390: done dumping result, returning 13531 1726882432.42400: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' [0e448fcc-3ce9-4fd9-519d-00000000035c] 13531 1726882432.42411: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000035c 13531 1726882432.42540: no more pending results, returning what we have 13531 1726882432.42545: in VariableManager get_vars() 13531 1726882432.42613: Calling all_inventory to load 
vars for managed_node2 13531 1726882432.42616: Calling groups_inventory to load vars for managed_node2 13531 1726882432.42619: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882432.42633: Calling all_plugins_play to load vars for managed_node2 13531 1726882432.42636: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882432.42640: Calling groups_plugins_play to load vars for managed_node2 13531 1726882432.43685: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000035c 13531 1726882432.43689: WORKER PROCESS EXITING 13531 1726882432.44437: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882432.46790: done with get_vars() 13531 1726882432.46815: variable 'ansible_search_path' from source: unknown 13531 1726882432.46817: variable 'ansible_search_path' from source: unknown 13531 1726882432.46855: we have included files to process 13531 1726882432.46856: generating all_blocks data 13531 1726882432.46858: done generating all_blocks data 13531 1726882432.46865: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 13531 1726882432.46866: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 13531 1726882432.46869: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 13531 1726882432.47829: done processing included file 13531 1726882432.47831: iterating over new_blocks loaded from include file 13531 1726882432.47833: in VariableManager get_vars() 13531 1726882432.47864: done with get_vars() 13531 1726882432.47867: filtering new block on tags 13531 1726882432.47892: done filtering new block on tags 13531 1726882432.47895: in VariableManager get_vars() 13531 1726882432.47922: done with 
get_vars() 13531 1726882432.47924: filtering new block on tags 13531 1726882432.47945: done filtering new block on tags 13531 1726882432.47947: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node2 13531 1726882432.47952: extending task lists for all hosts with included blocks 13531 1726882432.48128: done extending task lists 13531 1726882432.48130: done processing included files 13531 1726882432.48131: results queue empty 13531 1726882432.48132: checking for any_errors_fatal 13531 1726882432.48135: done checking for any_errors_fatal 13531 1726882432.48135: checking for max_fail_percentage 13531 1726882432.48136: done checking for max_fail_percentage 13531 1726882432.48137: checking to see if all hosts have failed and the running result is not ok 13531 1726882432.48138: done checking to see if all hosts have failed 13531 1726882432.48139: getting the remaining hosts for this loop 13531 1726882432.48140: done getting the remaining hosts for this loop 13531 1726882432.48142: getting the next task for host managed_node2 13531 1726882432.48146: done getting next task for host managed_node2 13531 1726882432.48148: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 13531 1726882432.48151: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882432.48153: getting variables 13531 1726882432.48154: in VariableManager get_vars() 13531 1726882432.48174: Calling all_inventory to load vars for managed_node2 13531 1726882432.48177: Calling groups_inventory to load vars for managed_node2 13531 1726882432.48179: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882432.48184: Calling all_plugins_play to load vars for managed_node2 13531 1726882432.48186: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882432.48189: Calling groups_plugins_play to load vars for managed_node2 13531 1726882432.49568: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882432.52007: done with get_vars() 13531 1726882432.52030: done getting variables 13531 1726882432.52080: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:33:52 -0400 (0:00:00.110) 0:00:20.416 ****** 13531 1726882432.52110: entering _queue_task() for managed_node2/set_fact 13531 1726882432.52443: worker is 1 (out of 1 available) 13531 1726882432.52454: exiting _queue_task() for managed_node2/set_fact 13531 1726882432.52470: done queuing things up, now waiting for results queue to drain 13531 1726882432.52472: 
waiting for pending results... 13531 1726882432.52777: running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag 13531 1726882432.52874: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000062c 13531 1726882432.52885: variable 'ansible_search_path' from source: unknown 13531 1726882432.52889: variable 'ansible_search_path' from source: unknown 13531 1726882432.52923: calling self._execute() 13531 1726882432.53000: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882432.53005: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882432.53018: variable 'omit' from source: magic vars 13531 1726882432.53307: variable 'ansible_distribution_major_version' from source: facts 13531 1726882432.53318: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882432.53323: variable 'omit' from source: magic vars 13531 1726882432.53359: variable 'omit' from source: magic vars 13531 1726882432.53387: variable 'omit' from source: magic vars 13531 1726882432.53423: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882432.53449: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882432.53481: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882432.53494: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882432.53504: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882432.53527: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882432.53530: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882432.53532: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882432.53607: Set connection var ansible_pipelining to False 13531 1726882432.53611: Set connection var ansible_timeout to 10 13531 1726882432.53616: Set connection var ansible_shell_executable to /bin/sh 13531 1726882432.53621: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882432.53623: Set connection var ansible_connection to ssh 13531 1726882432.53626: Set connection var ansible_shell_type to sh 13531 1726882432.53645: variable 'ansible_shell_executable' from source: unknown 13531 1726882432.53649: variable 'ansible_connection' from source: unknown 13531 1726882432.53652: variable 'ansible_module_compression' from source: unknown 13531 1726882432.53654: variable 'ansible_shell_type' from source: unknown 13531 1726882432.53656: variable 'ansible_shell_executable' from source: unknown 13531 1726882432.53672: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882432.53676: variable 'ansible_pipelining' from source: unknown 13531 1726882432.53683: variable 'ansible_timeout' from source: unknown 13531 1726882432.53698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882432.53857: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882432.53877: variable 'omit' from source: magic vars 13531 1726882432.53897: starting attempt loop 13531 1726882432.53912: running the handler 13531 1726882432.53931: handler run complete 13531 1726882432.53952: attempt loop complete, returning result 13531 1726882432.53960: _execute() done 13531 1726882432.53969: dumping result to json 13531 1726882432.53976: done dumping result, returning 13531 
1726882432.53996: done running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag [0e448fcc-3ce9-4fd9-519d-00000000062c] 13531 1726882432.54008: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000062c 13531 1726882432.54115: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000062c 13531 1726882432.54121: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 13531 1726882432.54217: no more pending results, returning what we have 13531 1726882432.54220: results queue empty 13531 1726882432.54220: checking for any_errors_fatal 13531 1726882432.54223: done checking for any_errors_fatal 13531 1726882432.54223: checking for max_fail_percentage 13531 1726882432.54225: done checking for max_fail_percentage 13531 1726882432.54226: checking to see if all hosts have failed and the running result is not ok 13531 1726882432.54227: done checking to see if all hosts have failed 13531 1726882432.54227: getting the remaining hosts for this loop 13531 1726882432.54228: done getting the remaining hosts for this loop 13531 1726882432.54231: getting the next task for host managed_node2 13531 1726882432.54237: done getting next task for host managed_node2 13531 1726882432.54240: ^ task is: TASK: Stat profile file 13531 1726882432.54244: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882432.54248: getting variables 13531 1726882432.54250: in VariableManager get_vars() 13531 1726882432.54312: Calling all_inventory to load vars for managed_node2 13531 1726882432.54315: Calling groups_inventory to load vars for managed_node2 13531 1726882432.54317: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882432.54328: Calling all_plugins_play to load vars for managed_node2 13531 1726882432.54331: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882432.54334: Calling groups_plugins_play to load vars for managed_node2 13531 1726882432.56693: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882432.58126: done with get_vars() 13531 1726882432.58147: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:33:52 -0400 (0:00:00.061) 0:00:20.477 ****** 13531 1726882432.58241: entering _queue_task() for managed_node2/stat 13531 1726882432.58532: worker is 1 (out of 1 available) 13531 1726882432.58545: exiting _queue_task() for managed_node2/stat 13531 1726882432.58560: done queuing things up, now waiting for results queue to drain 13531 1726882432.58611: waiting for pending results... 
13531 1726882432.58807: running TaskExecutor() for managed_node2/TASK: Stat profile file 13531 1726882432.58930: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000062d 13531 1726882432.58958: variable 'ansible_search_path' from source: unknown 13531 1726882432.58967: variable 'ansible_search_path' from source: unknown 13531 1726882432.59006: calling self._execute() 13531 1726882432.59116: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882432.59130: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882432.59148: variable 'omit' from source: magic vars 13531 1726882432.59643: variable 'ansible_distribution_major_version' from source: facts 13531 1726882432.59665: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882432.59678: variable 'omit' from source: magic vars 13531 1726882432.59732: variable 'omit' from source: magic vars 13531 1726882432.59841: variable 'profile' from source: include params 13531 1726882432.59850: variable 'item' from source: include params 13531 1726882432.60044: variable 'item' from source: include params 13531 1726882432.60071: variable 'omit' from source: magic vars 13531 1726882432.60119: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882432.60290: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882432.60316: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882432.60343: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882432.60485: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882432.60518: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 
1726882432.60525: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882432.60533: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882432.60649: Set connection var ansible_pipelining to False 13531 1726882432.60670: Set connection var ansible_timeout to 10 13531 1726882432.60681: Set connection var ansible_shell_executable to /bin/sh 13531 1726882432.60694: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882432.60700: Set connection var ansible_connection to ssh 13531 1726882432.60708: Set connection var ansible_shell_type to sh 13531 1726882432.60739: variable 'ansible_shell_executable' from source: unknown 13531 1726882432.60747: variable 'ansible_connection' from source: unknown 13531 1726882432.60755: variable 'ansible_module_compression' from source: unknown 13531 1726882432.60762: variable 'ansible_shell_type' from source: unknown 13531 1726882432.60771: variable 'ansible_shell_executable' from source: unknown 13531 1726882432.60777: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882432.60784: variable 'ansible_pipelining' from source: unknown 13531 1726882432.60792: variable 'ansible_timeout' from source: unknown 13531 1726882432.60803: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882432.61028: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13531 1726882432.61042: variable 'omit' from source: magic vars 13531 1726882432.61052: starting attempt loop 13531 1726882432.61070: running the handler 13531 1726882432.61092: _low_level_execute_command(): starting 13531 1726882432.61105: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13531 1726882432.62088: stderr chunk 
(state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882432.62092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882432.62119: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 13531 1726882432.62122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882432.62127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882432.62201: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882432.62204: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882432.62207: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882432.62326: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882432.64014: stdout chunk (state=3): >>>/root <<< 13531 1726882432.64113: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882432.64173: stderr chunk (state=3): >>><<< 13531 1726882432.64180: stdout chunk (state=3): >>><<< 13531 1726882432.64200: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882432.64213: _low_level_execute_command(): starting 13531 1726882432.64219: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882432.6420038-14446-50216691589493 `" && echo ansible-tmp-1726882432.6420038-14446-50216691589493="` echo /root/.ansible/tmp/ansible-tmp-1726882432.6420038-14446-50216691589493 `" ) && sleep 0' 13531 1726882432.64882: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882432.64886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882432.64922: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 13531 
1726882432.64934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882432.64937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 13531 1726882432.64939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882432.64991: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882432.65004: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882432.65159: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882432.67061: stdout chunk (state=3): >>>ansible-tmp-1726882432.6420038-14446-50216691589493=/root/.ansible/tmp/ansible-tmp-1726882432.6420038-14446-50216691589493 <<< 13531 1726882432.67172: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882432.67236: stderr chunk (state=3): >>><<< 13531 1726882432.67238: stdout chunk (state=3): >>><<< 13531 1726882432.67372: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882432.6420038-14446-50216691589493=/root/.ansible/tmp/ansible-tmp-1726882432.6420038-14446-50216691589493 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882432.67375: variable 'ansible_module_compression' from source: unknown 13531 1726882432.67377: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13531i_snf1g8/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 13531 1726882432.67379: variable 'ansible_facts' from source: unknown 13531 1726882432.67437: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882432.6420038-14446-50216691589493/AnsiballZ_stat.py 13531 1726882432.67547: Sending initial data 13531 1726882432.67550: Sent initial data (152 bytes) 13531 1726882432.68217: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882432.68222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882432.68264: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882432.68269: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
13531 1726882432.68286: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 13531 1726882432.68289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882432.68341: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882432.68350: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882432.68353: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882432.68452: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882432.70244: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13531 1726882432.70337: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13531 1726882432.70437: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13531i_snf1g8/tmpq29jzg2f 
/root/.ansible/tmp/ansible-tmp-1726882432.6420038-14446-50216691589493/AnsiballZ_stat.py <<< 13531 1726882432.70532: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13531 1726882432.71545: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882432.71648: stderr chunk (state=3): >>><<< 13531 1726882432.71651: stdout chunk (state=3): >>><<< 13531 1726882432.71671: done transferring module to remote 13531 1726882432.71680: _low_level_execute_command(): starting 13531 1726882432.71689: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882432.6420038-14446-50216691589493/ /root/.ansible/tmp/ansible-tmp-1726882432.6420038-14446-50216691589493/AnsiballZ_stat.py && sleep 0' 13531 1726882432.72143: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882432.72148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882432.72182: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882432.72194: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882432.72244: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master <<< 13531 1726882432.72259: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882432.72371: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882432.74157: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882432.74214: stderr chunk (state=3): >>><<< 13531 1726882432.74217: stdout chunk (state=3): >>><<< 13531 1726882432.74231: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882432.74234: _low_level_execute_command(): starting 13531 1726882432.74239: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882432.6420038-14446-50216691589493/AnsiballZ_stat.py && sleep 0' 13531 1726882432.74696: stderr 
chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882432.74701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882432.74737: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882432.74742: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 13531 1726882432.74752: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882432.74768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882432.74808: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882432.74820: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882432.74831: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882432.74948: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882432.88059: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 13531 1726882432.89071: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 Shared connection to 10.31.11.158 closed. <<< 13531 1726882432.89126: stderr chunk (state=3): >>><<< 13531 1726882432.89129: stdout chunk (state=3): >>><<< 13531 1726882432.89145: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
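The module invocation above records the full argument set passed to `stat` (`path`, `get_checksum: false`, etc.), which is enough to sketch the task that likely produced it. This is a hypothetical reconstruction, not the actual playbook source: only the module arguments and the task name reported later in the log ("Stat profile file") are taken from the output; the `register` name is inferred from the `profile_stat` variable referenced in the subsequent conditional.

```yaml
# Hypothetical reconstruction of the task seen in the log above.
# Module arguments are copied from the logged "invocation" block;
# the register name is inferred, not confirmed by the log.
- name: Stat profile file
  ansible.builtin.stat:
    path: /etc/sysconfig/network-scripts/ifcfg-bond0.0
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: profile_stat
```

With `exists: false` in the returned `stat` dict, any later task gated on `profile_stat.stat.exists` will be skipped, which is exactly what the log shows next.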
13531 1726882432.89172: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882432.6420038-14446-50216691589493/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13531 1726882432.89180: _low_level_execute_command(): starting 13531 1726882432.89184: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882432.6420038-14446-50216691589493/ > /dev/null 2>&1 && sleep 0' 13531 1726882432.89665: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882432.89669: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882432.89706: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882432.89709: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 
1726882432.89711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 13531 1726882432.89713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882432.89765: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882432.89769: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882432.89887: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882432.91817: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882432.91820: stdout chunk (state=3): >>><<< 13531 1726882432.91822: stderr chunk (state=3): >>><<< 13531 1726882432.91874: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882432.91877: handler run complete 13531 1726882432.91880: attempt loop complete, returning result 13531 1726882432.91982: _execute() done 13531 1726882432.91985: dumping result to json 13531 1726882432.91987: done dumping result, returning 13531 1726882432.91990: done running TaskExecutor() for managed_node2/TASK: Stat profile file [0e448fcc-3ce9-4fd9-519d-00000000062d] 13531 1726882432.91992: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000062d ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 13531 1726882432.92119: no more pending results, returning what we have 13531 1726882432.92122: results queue empty 13531 1726882432.92123: checking for any_errors_fatal 13531 1726882432.92129: done checking for any_errors_fatal 13531 1726882432.92129: checking for max_fail_percentage 13531 1726882432.92131: done checking for max_fail_percentage 13531 1726882432.92131: checking to see if all hosts have failed and the running result is not ok 13531 1726882432.92132: done checking to see if all hosts have failed 13531 1726882432.92133: getting the remaining hosts for this loop 13531 1726882432.92134: done getting the remaining hosts for this loop 13531 1726882432.92137: getting the next task for host managed_node2 13531 1726882432.92143: done getting next task for host managed_node2 13531 1726882432.92147: ^ task is: TASK: Set NM profile exist flag based on the profile files 13531 1726882432.92150: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882432.92157: getting variables 13531 1726882432.92158: in VariableManager get_vars() 13531 1726882432.92220: Calling all_inventory to load vars for managed_node2 13531 1726882432.92223: Calling groups_inventory to load vars for managed_node2 13531 1726882432.92225: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882432.92238: Calling all_plugins_play to load vars for managed_node2 13531 1726882432.92241: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882432.92244: Calling groups_plugins_play to load vars for managed_node2 13531 1726882432.92765: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000062d 13531 1726882432.92771: WORKER PROCESS EXITING 13531 1726882432.93736: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882432.94675: done with get_vars() 13531 1726882432.94697: done getting variables 13531 1726882432.94742: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 
September 2024 21:33:52 -0400 (0:00:00.365) 0:00:20.842 ****** 13531 1726882432.94768: entering _queue_task() for managed_node2/set_fact 13531 1726882432.95013: worker is 1 (out of 1 available) 13531 1726882432.95026: exiting _queue_task() for managed_node2/set_fact 13531 1726882432.95039: done queuing things up, now waiting for results queue to drain 13531 1726882432.95041: waiting for pending results... 13531 1726882432.95580: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files 13531 1726882432.95585: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000062e 13531 1726882432.95590: variable 'ansible_search_path' from source: unknown 13531 1726882432.95593: variable 'ansible_search_path' from source: unknown 13531 1726882432.95596: calling self._execute() 13531 1726882432.95599: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882432.95601: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882432.95605: variable 'omit' from source: magic vars 13531 1726882432.95973: variable 'ansible_distribution_major_version' from source: facts 13531 1726882432.96073: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882432.96085: variable 'profile_stat' from source: set_fact 13531 1726882432.96106: Evaluated conditional (profile_stat.stat.exists): False 13531 1726882432.96110: when evaluation is False, skipping this task 13531 1726882432.96113: _execute() done 13531 1726882432.96115: dumping result to json 13531 1726882432.96117: done dumping result, returning 13531 1726882432.96122: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files [0e448fcc-3ce9-4fd9-519d-00000000062e] 13531 1726882432.96129: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000062e 13531 1726882432.96232: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000062e 13531 1726882432.96236: WORKER 
PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13531 1726882432.96287: no more pending results, returning what we have 13531 1726882432.96292: results queue empty 13531 1726882432.96292: checking for any_errors_fatal 13531 1726882432.96302: done checking for any_errors_fatal 13531 1726882432.96303: checking for max_fail_percentage 13531 1726882432.96304: done checking for max_fail_percentage 13531 1726882432.96305: checking to see if all hosts have failed and the running result is not ok 13531 1726882432.96306: done checking to see if all hosts have failed 13531 1726882432.96307: getting the remaining hosts for this loop 13531 1726882432.96308: done getting the remaining hosts for this loop 13531 1726882432.96312: getting the next task for host managed_node2 13531 1726882432.96317: done getting next task for host managed_node2 13531 1726882432.96320: ^ task is: TASK: Get NM profile info 13531 1726882432.96323: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882432.96327: getting variables 13531 1726882432.96329: in VariableManager get_vars() 13531 1726882432.96393: Calling all_inventory to load vars for managed_node2 13531 1726882432.96397: Calling groups_inventory to load vars for managed_node2 13531 1726882432.96399: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882432.96415: Calling all_plugins_play to load vars for managed_node2 13531 1726882432.96418: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882432.96422: Calling groups_plugins_play to load vars for managed_node2 13531 1726882432.98237: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882432.99851: done with get_vars() 13531 1726882432.99890: done getting variables 13531 1726882432.99951: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:33:52 -0400 (0:00:00.052) 0:00:20.895 ****** 13531 1726882432.99986: entering _queue_task() for managed_node2/shell 13531 1726882433.00328: worker is 1 (out of 1 available) 13531 1726882433.00340: exiting _queue_task() for managed_node2/shell 13531 1726882433.00355: done queuing things up, now waiting for results queue to drain 13531 1726882433.00357: waiting for pending results... 
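The "Set NM profile exist flag" task above is skipped because its conditional (`profile_stat.stat.exists`) evaluates to `False`. A minimal sketch of what such a conditional task typically looks like follows; the fact name being set is hypothetical (the log does not show it), while the task name and both `when` conditions are taken verbatim from the logged evaluation.

```yaml
# Sketch of a conditional set_fact task consistent with the log.
# The fact name (lsr_net_profile_exists) is an assumption; only the
# task name and the two conditionals are visible in the output.
- name: Set NM profile exist flag based on the profile files
  ansible.builtin.set_fact:
    lsr_net_profile_exists: true
  when:
    - ansible_distribution_major_version != '6'
    - profile_stat.stat.exists
```

Because the second condition fails, Ansible reports `skip_reason: "Conditional result was False"` and moves on to the "Get NM profile info" shell task without executing the module.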
13531 1726882433.00653: running TaskExecutor() for managed_node2/TASK: Get NM profile info 13531 1726882433.00755: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000062f 13531 1726882433.00775: variable 'ansible_search_path' from source: unknown 13531 1726882433.00779: variable 'ansible_search_path' from source: unknown 13531 1726882433.00821: calling self._execute() 13531 1726882433.00924: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882433.00928: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882433.00939: variable 'omit' from source: magic vars 13531 1726882433.01328: variable 'ansible_distribution_major_version' from source: facts 13531 1726882433.01344: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882433.01350: variable 'omit' from source: magic vars 13531 1726882433.01406: variable 'omit' from source: magic vars 13531 1726882433.01523: variable 'profile' from source: include params 13531 1726882433.01527: variable 'item' from source: include params 13531 1726882433.01601: variable 'item' from source: include params 13531 1726882433.01620: variable 'omit' from source: magic vars 13531 1726882433.01674: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882433.01714: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882433.01735: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882433.01751: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882433.01767: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882433.01802: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 
1726882433.01805: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882433.01807: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882433.01914: Set connection var ansible_pipelining to False 13531 1726882433.01919: Set connection var ansible_timeout to 10 13531 1726882433.01924: Set connection var ansible_shell_executable to /bin/sh 13531 1726882433.01930: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882433.01933: Set connection var ansible_connection to ssh 13531 1726882433.01935: Set connection var ansible_shell_type to sh 13531 1726882433.01966: variable 'ansible_shell_executable' from source: unknown 13531 1726882433.01970: variable 'ansible_connection' from source: unknown 13531 1726882433.01972: variable 'ansible_module_compression' from source: unknown 13531 1726882433.01975: variable 'ansible_shell_type' from source: unknown 13531 1726882433.01977: variable 'ansible_shell_executable' from source: unknown 13531 1726882433.01979: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882433.01981: variable 'ansible_pipelining' from source: unknown 13531 1726882433.01984: variable 'ansible_timeout' from source: unknown 13531 1726882433.01986: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882433.02132: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882433.02142: variable 'omit' from source: magic vars 13531 1726882433.02148: starting attempt loop 13531 1726882433.02150: running the handler 13531 1726882433.02165: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882433.02183: _low_level_execute_command(): starting 13531 1726882433.02190: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13531 1726882433.03026: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882433.03037: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882433.03048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882433.03070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882433.03117: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882433.03125: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882433.03134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882433.03148: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882433.03160: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882433.03170: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882433.03178: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882433.03188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882433.03201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882433.03210: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882433.03217: stderr chunk (state=3): >>>debug2: match found <<< 13531 
1726882433.03227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882433.03307: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882433.03324: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882433.03332: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882433.03476: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882433.05140: stdout chunk (state=3): >>>/root <<< 13531 1726882433.05239: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882433.05338: stderr chunk (state=3): >>><<< 13531 1726882433.05342: stdout chunk (state=3): >>><<< 13531 1726882433.05374: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received 
exit status from master 0 13531 1726882433.05387: _low_level_execute_command(): starting 13531 1726882433.05395: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882433.0537398-14466-201254141040892 `" && echo ansible-tmp-1726882433.0537398-14466-201254141040892="` echo /root/.ansible/tmp/ansible-tmp-1726882433.0537398-14466-201254141040892 `" ) && sleep 0' 13531 1726882433.06101: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882433.06110: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882433.06120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882433.06136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882433.06186: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882433.06193: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882433.06203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882433.06216: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882433.06224: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882433.06230: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882433.06238: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882433.06249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882433.06271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882433.06276: stderr chunk (state=3): >>>debug2: checking match for 'final 
all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882433.06283: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882433.06293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882433.06369: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882433.06389: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882433.06397: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882433.06529: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882433.08413: stdout chunk (state=3): >>>ansible-tmp-1726882433.0537398-14466-201254141040892=/root/.ansible/tmp/ansible-tmp-1726882433.0537398-14466-201254141040892 <<< 13531 1726882433.08520: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882433.08624: stderr chunk (state=3): >>><<< 13531 1726882433.08636: stdout chunk (state=3): >>><<< 13531 1726882433.08869: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882433.0537398-14466-201254141040892=/root/.ansible/tmp/ansible-tmp-1726882433.0537398-14466-201254141040892 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882433.08873: variable 'ansible_module_compression' from source: unknown 13531 1726882433.08875: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13531i_snf1g8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13531 1726882433.08878: variable 'ansible_facts' from source: unknown 13531 1726882433.08901: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882433.0537398-14466-201254141040892/AnsiballZ_command.py 13531 1726882433.09067: Sending initial data 13531 1726882433.09073: Sent initial data (156 bytes) 13531 1726882433.10074: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882433.10084: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882433.10094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882433.10107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882433.10147: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882433.10154: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882433.10170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882433.10184: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882433.10192: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882433.10203: stderr 
chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882433.10206: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882433.10217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882433.10228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882433.10235: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882433.10242: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882433.10252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882433.10326: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882433.10342: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882433.10345: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882433.10490: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882433.12271: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13531 1726882433.12361: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13531 1726882433.12450: stdout chunk 
(state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13531i_snf1g8/tmpwkllc5ld /root/.ansible/tmp/ansible-tmp-1726882433.0537398-14466-201254141040892/AnsiballZ_command.py <<< 13531 1726882433.12545: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13531 1726882433.13847: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882433.14015: stderr chunk (state=3): >>><<< 13531 1726882433.14019: stdout chunk (state=3): >>><<< 13531 1726882433.14044: done transferring module to remote 13531 1726882433.14060: _low_level_execute_command(): starting 13531 1726882433.14067: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882433.0537398-14466-201254141040892/ /root/.ansible/tmp/ansible-tmp-1726882433.0537398-14466-201254141040892/AnsiballZ_command.py && sleep 0' 13531 1726882433.14729: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882433.14739: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882433.14749: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882433.14768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882433.14806: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882433.14813: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882433.14823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882433.14837: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882433.14844: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882433.14851: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 
1726882433.14862: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882433.14875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882433.14886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882433.14893: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882433.14900: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882433.14909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882433.14983: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882433.15002: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882433.15015: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882433.15139: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882433.16990: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882433.17025: stderr chunk (state=3): >>><<< 13531 1726882433.17028: stdout chunk (state=3): >>><<< 13531 1726882433.17132: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882433.17136: _low_level_execute_command(): starting 13531 1726882433.17139: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882433.0537398-14466-201254141040892/AnsiballZ_command.py && sleep 0' 13531 1726882433.17754: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882433.17772: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882433.17792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882433.17812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882433.17856: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882433.17871: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882433.17889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882433.17906: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882433.17917: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882433.17926: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882433.17937: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 
1726882433.17948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882433.17964: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882433.17976: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882433.17986: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882433.18003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882433.18080: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882433.18103: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882433.18124: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882433.18261: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882433.34026: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-20 21:33:53.315094", "end": "2024-09-20 21:33:53.338361", "delta": "0:00:00.023267", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13531 1726882433.35300: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 13531 1726882433.35359: stderr chunk (state=3): >>><<< 13531 1726882433.35365: stdout chunk (state=3): >>><<< 13531 1726882433.35379: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-20 21:33:53.315094", "end": "2024-09-20 21:33:53.338361", "delta": "0:00:00.023267", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 
closed. 13531 1726882433.35408: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882433.0537398-14466-201254141040892/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13531 1726882433.35414: _low_level_execute_command(): starting 13531 1726882433.35419: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882433.0537398-14466-201254141040892/ > /dev/null 2>&1 && sleep 0' 13531 1726882433.35894: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882433.35898: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882433.35901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882433.35935: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882433.35938: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882433.35940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882433.35987: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882433.36007: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882433.36010: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882433.36116: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882433.37929: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882433.37988: stderr chunk (state=3): >>><<< 13531 1726882433.37991: stdout chunk (state=3): >>><<< 13531 1726882433.38048: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882433.38051: handler run complete 13531 1726882433.38054: Evaluated conditional (False): False 13531 1726882433.38056: attempt loop complete, returning result 13531 1726882433.38057: _execute() done 13531 1726882433.38059: dumping result to json 13531 1726882433.38061: done dumping result, returning 13531 1726882433.38063: done running TaskExecutor() for managed_node2/TASK: Get NM profile info [0e448fcc-3ce9-4fd9-519d-00000000062f] 13531 1726882433.38071: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000062f 13531 1726882433.38153: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000062f 13531 1726882433.38156: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "delta": "0:00:00.023267", "end": "2024-09-20 21:33:53.338361", "rc": 0, "start": "2024-09-20 21:33:53.315094" } STDOUT: bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection 13531 1726882433.38242: no more pending results, returning what we have 13531 1726882433.38246: results queue empty 13531 1726882433.38247: checking for any_errors_fatal 13531 1726882433.38254: done checking for any_errors_fatal 13531 1726882433.38255: checking for max_fail_percentage 13531 1726882433.38257: done checking for max_fail_percentage 13531 1726882433.38258: checking to see if all hosts have failed and the running result is not ok 13531 1726882433.38259: done checking to see if all hosts have failed 13531 1726882433.38260: getting the remaining hosts for this loop 13531 1726882433.38261: done getting the remaining hosts for this loop 13531 1726882433.38266: getting the next task for host managed_node2 13531 1726882433.38272: done getting next task for host managed_node2 13531 
1726882433.38274: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 13531 1726882433.38278: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882433.38281: getting variables 13531 1726882433.38284: in VariableManager get_vars() 13531 1726882433.38331: Calling all_inventory to load vars for managed_node2 13531 1726882433.38334: Calling groups_inventory to load vars for managed_node2 13531 1726882433.38336: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882433.38346: Calling all_plugins_play to load vars for managed_node2 13531 1726882433.38349: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882433.38351: Calling groups_plugins_play to load vars for managed_node2 13531 1726882433.39312: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882433.40830: done with get_vars() 13531 1726882433.40855: done getting variables 13531 1726882433.40921: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:33:53 -0400 (0:00:00.409) 0:00:21.304 ****** 13531 1726882433.40955: entering _queue_task() for managed_node2/set_fact 13531 1726882433.41253: worker is 1 (out of 1 available) 13531 1726882433.41285: exiting _queue_task() for managed_node2/set_fact 13531 1726882433.41298: done queuing things up, now waiting for results queue to drain 13531 1726882433.41299: waiting for pending results... 13531 1726882433.41510: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 13531 1726882433.41618: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000630 13531 1726882433.41640: variable 'ansible_search_path' from source: unknown 13531 1726882433.41655: variable 'ansible_search_path' from source: unknown 13531 1726882433.41698: calling self._execute() 13531 1726882433.41807: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882433.41819: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882433.41833: variable 'omit' from source: magic vars 13531 1726882433.42247: variable 'ansible_distribution_major_version' from source: facts 13531 1726882433.42268: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882433.42424: variable 'nm_profile_exists' from source: set_fact 13531 1726882433.42444: Evaluated conditional (nm_profile_exists.rc == 0): True 13531 1726882433.42456: variable 'omit' from source: magic vars 13531 1726882433.42505: variable 'omit' from source: magic vars 13531 1726882433.42555: 
variable 'omit' from source: magic vars 13531 1726882433.42606: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882433.42654: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882433.42683: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882433.42704: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882433.42719: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882433.42762: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882433.42774: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882433.42782: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882433.42881: Set connection var ansible_pipelining to False 13531 1726882433.42886: Set connection var ansible_timeout to 10 13531 1726882433.42891: Set connection var ansible_shell_executable to /bin/sh 13531 1726882433.42903: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882433.42906: Set connection var ansible_connection to ssh 13531 1726882433.42912: Set connection var ansible_shell_type to sh 13531 1726882433.42944: variable 'ansible_shell_executable' from source: unknown 13531 1726882433.42947: variable 'ansible_connection' from source: unknown 13531 1726882433.42949: variable 'ansible_module_compression' from source: unknown 13531 1726882433.42955: variable 'ansible_shell_type' from source: unknown 13531 1726882433.42958: variable 'ansible_shell_executable' from source: unknown 13531 1726882433.42966: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882433.42971: variable 'ansible_pipelining' from 
source: unknown 13531 1726882433.42974: variable 'ansible_timeout' from source: unknown 13531 1726882433.42978: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882433.43084: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882433.43097: variable 'omit' from source: magic vars 13531 1726882433.43102: starting attempt loop 13531 1726882433.43106: running the handler 13531 1726882433.43116: handler run complete 13531 1726882433.43124: attempt loop complete, returning result 13531 1726882433.43126: _execute() done 13531 1726882433.43129: dumping result to json 13531 1726882433.43131: done dumping result, returning 13531 1726882433.43138: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0e448fcc-3ce9-4fd9-519d-000000000630] 13531 1726882433.43144: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000630 13531 1726882433.43228: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000630 13531 1726882433.43231: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 13531 1726882433.43287: no more pending results, returning what we have 13531 1726882433.43290: results queue empty 13531 1726882433.43290: checking for any_errors_fatal 13531 1726882433.43300: done checking for any_errors_fatal 13531 1726882433.43301: checking for max_fail_percentage 13531 1726882433.43302: done checking for max_fail_percentage 13531 1726882433.43303: checking to see if all hosts have failed and the running result is not ok 13531 1726882433.43304: 
done checking to see if all hosts have failed 13531 1726882433.43304: getting the remaining hosts for this loop 13531 1726882433.43306: done getting the remaining hosts for this loop 13531 1726882433.43309: getting the next task for host managed_node2 13531 1726882433.43319: done getting next task for host managed_node2 13531 1726882433.43321: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 13531 1726882433.43325: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882433.43328: getting variables 13531 1726882433.43329: in VariableManager get_vars() 13531 1726882433.43383: Calling all_inventory to load vars for managed_node2 13531 1726882433.43386: Calling groups_inventory to load vars for managed_node2 13531 1726882433.43388: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882433.43398: Calling all_plugins_play to load vars for managed_node2 13531 1726882433.43400: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882433.43403: Calling groups_plugins_play to load vars for managed_node2 13531 1726882433.44234: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882433.45901: done with get_vars() 13531 1726882433.45931: done getting variables 13531 1726882433.45983: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13531 1726882433.46076: variable 'profile' from source: include params 13531 1726882433.46079: variable 'item' from source: include params 13531 1726882433.46126: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.0] ************************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:33:53 -0400 (0:00:00.051) 0:00:21.356 ****** 13531 1726882433.46155: entering _queue_task() for managed_node2/command 13531 1726882433.46386: worker is 1 (out of 1 available) 13531 1726882433.46398: exiting _queue_task() for managed_node2/command 13531 1726882433.46410: done queuing things up, now waiting for results queue to drain 13531 1726882433.46411: waiting for pending results... 
13531 1726882433.46596: running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-bond0.0 13531 1726882433.46674: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000632 13531 1726882433.46689: variable 'ansible_search_path' from source: unknown 13531 1726882433.46692: variable 'ansible_search_path' from source: unknown 13531 1726882433.46718: calling self._execute() 13531 1726882433.46801: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882433.46806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882433.46814: variable 'omit' from source: magic vars 13531 1726882433.47098: variable 'ansible_distribution_major_version' from source: facts 13531 1726882433.47108: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882433.47196: variable 'profile_stat' from source: set_fact 13531 1726882433.47207: Evaluated conditional (profile_stat.stat.exists): False 13531 1726882433.47210: when evaluation is False, skipping this task 13531 1726882433.47213: _execute() done 13531 1726882433.47215: dumping result to json 13531 1726882433.47218: done dumping result, returning 13531 1726882433.47225: done running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-bond0.0 [0e448fcc-3ce9-4fd9-519d-000000000632] 13531 1726882433.47230: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000632 13531 1726882433.47313: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000632 13531 1726882433.47315: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13531 1726882433.47365: no more pending results, returning what we have 13531 1726882433.47369: results queue empty 13531 1726882433.47370: checking for any_errors_fatal 13531 1726882433.47377: done checking for any_errors_fatal 13531 1726882433.47377: 
checking for max_fail_percentage 13531 1726882433.47379: done checking for max_fail_percentage 13531 1726882433.47380: checking to see if all hosts have failed and the running result is not ok 13531 1726882433.47381: done checking to see if all hosts have failed 13531 1726882433.47382: getting the remaining hosts for this loop 13531 1726882433.47383: done getting the remaining hosts for this loop 13531 1726882433.47386: getting the next task for host managed_node2 13531 1726882433.47392: done getting next task for host managed_node2 13531 1726882433.47394: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 13531 1726882433.47398: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882433.47402: getting variables 13531 1726882433.47404: in VariableManager get_vars() 13531 1726882433.47455: Calling all_inventory to load vars for managed_node2 13531 1726882433.47458: Calling groups_inventory to load vars for managed_node2 13531 1726882433.47460: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882433.47472: Calling all_plugins_play to load vars for managed_node2 13531 1726882433.47474: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882433.47477: Calling groups_plugins_play to load vars for managed_node2 13531 1726882433.48314: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882433.49252: done with get_vars() 13531 1726882433.49271: done getting variables 13531 1726882433.49313: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13531 1726882433.49393: variable 'profile' from source: include params 13531 1726882433.49396: variable 'item' from source: include params 13531 1726882433.49434: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.0] ********************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:33:53 -0400 (0:00:00.032) 0:00:21.389 ****** 13531 1726882433.49456: entering _queue_task() for managed_node2/set_fact 13531 1726882433.49667: worker is 1 (out of 1 available) 13531 1726882433.49680: exiting _queue_task() for managed_node2/set_fact 13531 1726882433.49691: done queuing things up, now waiting for results queue to drain 13531 1726882433.49693: waiting for pending results... 
13531 1726882433.49872: running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 13531 1726882433.49950: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000633 13531 1726882433.49964: variable 'ansible_search_path' from source: unknown 13531 1726882433.49968: variable 'ansible_search_path' from source: unknown 13531 1726882433.50013: calling self._execute() 13531 1726882433.50090: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882433.50093: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882433.50101: variable 'omit' from source: magic vars 13531 1726882433.50366: variable 'ansible_distribution_major_version' from source: facts 13531 1726882433.50379: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882433.50461: variable 'profile_stat' from source: set_fact 13531 1726882433.50478: Evaluated conditional (profile_stat.stat.exists): False 13531 1726882433.50482: when evaluation is False, skipping this task 13531 1726882433.50485: _execute() done 13531 1726882433.50488: dumping result to json 13531 1726882433.50490: done dumping result, returning 13531 1726882433.50493: done running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 [0e448fcc-3ce9-4fd9-519d-000000000633] 13531 1726882433.50500: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000633 13531 1726882433.50584: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000633 13531 1726882433.50587: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13531 1726882433.50631: no more pending results, returning what we have 13531 1726882433.50634: results queue empty 13531 1726882433.50635: checking for any_errors_fatal 13531 1726882433.50643: done checking for any_errors_fatal 13531 1726882433.50643: 
checking for max_fail_percentage 13531 1726882433.50645: done checking for max_fail_percentage 13531 1726882433.50646: checking to see if all hosts have failed and the running result is not ok 13531 1726882433.50647: done checking to see if all hosts have failed 13531 1726882433.50647: getting the remaining hosts for this loop 13531 1726882433.50649: done getting the remaining hosts for this loop 13531 1726882433.50652: getting the next task for host managed_node2 13531 1726882433.50657: done getting next task for host managed_node2 13531 1726882433.50661: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 13531 1726882433.50667: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882433.50671: getting variables 13531 1726882433.50672: in VariableManager get_vars() 13531 1726882433.50722: Calling all_inventory to load vars for managed_node2 13531 1726882433.50725: Calling groups_inventory to load vars for managed_node2 13531 1726882433.50726: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882433.50737: Calling all_plugins_play to load vars for managed_node2 13531 1726882433.50739: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882433.50741: Calling groups_plugins_play to load vars for managed_node2 13531 1726882433.51652: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882433.52583: done with get_vars() 13531 1726882433.52601: done getting variables 13531 1726882433.52646: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13531 1726882433.52729: variable 'profile' from source: include params 13531 1726882433.52733: variable 'item' from source: include params 13531 1726882433.52775: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.0] **************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:33:53 -0400 (0:00:00.033) 0:00:21.423 ****** 13531 1726882433.52797: entering _queue_task() for managed_node2/command 13531 1726882433.53030: worker is 1 (out of 1 available) 13531 1726882433.53044: exiting _queue_task() for managed_node2/command 13531 1726882433.53056: done queuing things up, now waiting for results queue to drain 13531 1726882433.53057: waiting for pending results... 
13531 1726882433.53239: running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-bond0.0 13531 1726882433.53319: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000634 13531 1726882433.53330: variable 'ansible_search_path' from source: unknown 13531 1726882433.53333: variable 'ansible_search_path' from source: unknown 13531 1726882433.53367: calling self._execute() 13531 1726882433.53439: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882433.53443: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882433.53452: variable 'omit' from source: magic vars 13531 1726882433.53719: variable 'ansible_distribution_major_version' from source: facts 13531 1726882433.53731: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882433.53814: variable 'profile_stat' from source: set_fact 13531 1726882433.53827: Evaluated conditional (profile_stat.stat.exists): False 13531 1726882433.53832: when evaluation is False, skipping this task 13531 1726882433.53834: _execute() done 13531 1726882433.53837: dumping result to json 13531 1726882433.53839: done dumping result, returning 13531 1726882433.53843: done running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-bond0.0 [0e448fcc-3ce9-4fd9-519d-000000000634] 13531 1726882433.53850: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000634 13531 1726882433.53933: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000634 13531 1726882433.53935: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13531 1726882433.53989: no more pending results, returning what we have 13531 1726882433.53992: results queue empty 13531 1726882433.53993: checking for any_errors_fatal 13531 1726882433.53999: done checking for any_errors_fatal 13531 1726882433.53999: checking for 
max_fail_percentage 13531 1726882433.54001: done checking for max_fail_percentage 13531 1726882433.54002: checking to see if all hosts have failed and the running result is not ok 13531 1726882433.54002: done checking to see if all hosts have failed 13531 1726882433.54003: getting the remaining hosts for this loop 13531 1726882433.54004: done getting the remaining hosts for this loop 13531 1726882433.54007: getting the next task for host managed_node2 13531 1726882433.54014: done getting next task for host managed_node2 13531 1726882433.54017: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 13531 1726882433.54020: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882433.54024: getting variables 13531 1726882433.54025: in VariableManager get_vars() 13531 1726882433.54081: Calling all_inventory to load vars for managed_node2 13531 1726882433.54084: Calling groups_inventory to load vars for managed_node2 13531 1726882433.54086: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882433.54096: Calling all_plugins_play to load vars for managed_node2 13531 1726882433.54098: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882433.54101: Calling groups_plugins_play to load vars for managed_node2 13531 1726882433.54911: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882433.55952: done with get_vars() 13531 1726882433.55971: done getting variables 13531 1726882433.56016: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13531 1726882433.56097: variable 'profile' from source: include params 13531 1726882433.56100: variable 'item' from source: include params 13531 1726882433.56141: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.0] ************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:33:53 -0400 (0:00:00.033) 0:00:21.456 ****** 13531 1726882433.56165: entering _queue_task() for managed_node2/set_fact 13531 1726882433.56404: worker is 1 (out of 1 available) 13531 1726882433.56416: exiting _queue_task() for managed_node2/set_fact 13531 1726882433.56428: done queuing things up, now waiting for results queue to drain 13531 1726882433.56430: waiting for pending results... 
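Note that every task in this trace evaluates two gates in sequence: the distribution check (`ansible_distribution_major_version != '6'`, True here) and then the stat check (`profile_stat.stat.exists`, False here). A hedged sketch of a task carrying both conditionals, with a hypothetical fact name, could be:

```yaml
# Illustrative only: the fact name lsr_net_profile_fingerprint is a guess
# based on the task title; the two when: entries are ANDed, and the trace
# shows the second one failing, which skips the task.
- name: Verify the fingerprint comment in ifcfg-{{ profile }}
  set_fact:
    lsr_net_profile_fingerprint: true
  when:
    - ansible_distribution_major_version != '6'
    - profile_stat.stat.exists
```

This matches the order of evaluation in the trace: the distribution conditional is logged as True first, then the stat conditional as False, and only then is the task skipped.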
13531 1726882433.56610: running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-bond0.0 13531 1726882433.56689: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000635 13531 1726882433.56701: variable 'ansible_search_path' from source: unknown 13531 1726882433.56704: variable 'ansible_search_path' from source: unknown 13531 1726882433.56734: calling self._execute() 13531 1726882433.56813: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882433.56817: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882433.56825: variable 'omit' from source: magic vars 13531 1726882433.57095: variable 'ansible_distribution_major_version' from source: facts 13531 1726882433.57105: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882433.57188: variable 'profile_stat' from source: set_fact 13531 1726882433.57200: Evaluated conditional (profile_stat.stat.exists): False 13531 1726882433.57203: when evaluation is False, skipping this task 13531 1726882433.57207: _execute() done 13531 1726882433.57210: dumping result to json 13531 1726882433.57212: done dumping result, returning 13531 1726882433.57216: done running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-bond0.0 [0e448fcc-3ce9-4fd9-519d-000000000635] 13531 1726882433.57227: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000635 13531 1726882433.57311: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000635 13531 1726882433.57314: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13531 1726882433.57369: no more pending results, returning what we have 13531 1726882433.57373: results queue empty 13531 1726882433.57374: checking for any_errors_fatal 13531 1726882433.57380: done checking for any_errors_fatal 13531 1726882433.57381: checking 
for max_fail_percentage 13531 1726882433.57382: done checking for max_fail_percentage 13531 1726882433.57383: checking to see if all hosts have failed and the running result is not ok 13531 1726882433.57384: done checking to see if all hosts have failed 13531 1726882433.57384: getting the remaining hosts for this loop 13531 1726882433.57386: done getting the remaining hosts for this loop 13531 1726882433.57389: getting the next task for host managed_node2 13531 1726882433.57397: done getting next task for host managed_node2 13531 1726882433.57400: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 13531 1726882433.57403: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882433.57407: getting variables 13531 1726882433.57409: in VariableManager get_vars() 13531 1726882433.57465: Calling all_inventory to load vars for managed_node2 13531 1726882433.57468: Calling groups_inventory to load vars for managed_node2 13531 1726882433.57470: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882433.57480: Calling all_plugins_play to load vars for managed_node2 13531 1726882433.57483: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882433.57485: Calling groups_plugins_play to load vars for managed_node2 13531 1726882433.58314: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882433.59261: done with get_vars() 13531 1726882433.59282: done getting variables 13531 1726882433.59325: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13531 1726882433.59413: variable 'profile' from source: include params 13531 1726882433.59416: variable 'item' from source: include params 13531 1726882433.59456: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0.0'] ************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 21:33:53 -0400 (0:00:00.033) 0:00:21.490 ****** 13531 1726882433.59484: entering _queue_task() for managed_node2/assert 13531 1726882433.59718: worker is 1 (out of 1 available) 13531 1726882433.59732: exiting _queue_task() for managed_node2/assert 13531 1726882433.59745: done queuing things up, now waiting for results queue to drain 13531 1726882433.59746: waiting for pending results... 
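The assert tasks that run next check facts set earlier in the run (for example `lsr_net_profile_exists` and `lsr_net_profile_ansible_managed`, both evaluated True below). Based only on the task names and conditionals visible in this trace, assert_profile_present.yml plausibly contains something like this sketch:

```yaml
# Illustrative sketch -- the real assert_profile_present.yml is not
# included in this log. Variable names match the trace; the default
# "All assertions passed" message seen below implies no custom msg/that
# failure text was triggered.
- name: Assert that the profile is present - '{{ profile }}'
  assert:
    that:
      - lsr_net_profile_exists

- name: Assert that the ansible managed comment is present in '{{ profile }}'
  assert:
    that:
      - lsr_net_profile_ansible_managed
```

When all `that:` expressions are truthy, the assert action returns `changed: false` with `MSG: All assertions passed`, which is the `ok: [managed_node2]` result recorded in the trace below.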
13531 1726882433.59929: running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'bond0.0' 13531 1726882433.59997: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000035d 13531 1726882433.60010: variable 'ansible_search_path' from source: unknown 13531 1726882433.60013: variable 'ansible_search_path' from source: unknown 13531 1726882433.60046: calling self._execute() 13531 1726882433.60116: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882433.60119: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882433.60129: variable 'omit' from source: magic vars 13531 1726882433.60393: variable 'ansible_distribution_major_version' from source: facts 13531 1726882433.60404: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882433.60413: variable 'omit' from source: magic vars 13531 1726882433.60439: variable 'omit' from source: magic vars 13531 1726882433.60514: variable 'profile' from source: include params 13531 1726882433.60520: variable 'item' from source: include params 13531 1726882433.60568: variable 'item' from source: include params 13531 1726882433.60585: variable 'omit' from source: magic vars 13531 1726882433.60620: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882433.60649: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882433.60669: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882433.60685: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882433.60696: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882433.60719: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 13531 1726882433.60722: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882433.60724: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882433.60801: Set connection var ansible_pipelining to False 13531 1726882433.60806: Set connection var ansible_timeout to 10 13531 1726882433.60811: Set connection var ansible_shell_executable to /bin/sh 13531 1726882433.60818: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882433.60821: Set connection var ansible_connection to ssh 13531 1726882433.60823: Set connection var ansible_shell_type to sh 13531 1726882433.60845: variable 'ansible_shell_executable' from source: unknown 13531 1726882433.60848: variable 'ansible_connection' from source: unknown 13531 1726882433.60850: variable 'ansible_module_compression' from source: unknown 13531 1726882433.60852: variable 'ansible_shell_type' from source: unknown 13531 1726882433.60857: variable 'ansible_shell_executable' from source: unknown 13531 1726882433.60859: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882433.60861: variable 'ansible_pipelining' from source: unknown 13531 1726882433.60866: variable 'ansible_timeout' from source: unknown 13531 1726882433.60871: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882433.60974: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882433.60983: variable 'omit' from source: magic vars 13531 1726882433.60988: starting attempt loop 13531 1726882433.60990: running the handler 13531 1726882433.61065: variable 'lsr_net_profile_exists' from source: set_fact 13531 1726882433.61072: Evaluated conditional 
(lsr_net_profile_exists): True 13531 1726882433.61078: handler run complete 13531 1726882433.61090: attempt loop complete, returning result 13531 1726882433.61092: _execute() done 13531 1726882433.61095: dumping result to json 13531 1726882433.61098: done dumping result, returning 13531 1726882433.61106: done running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'bond0.0' [0e448fcc-3ce9-4fd9-519d-00000000035d] 13531 1726882433.61108: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000035d 13531 1726882433.61194: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000035d 13531 1726882433.61197: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 13531 1726882433.61248: no more pending results, returning what we have 13531 1726882433.61252: results queue empty 13531 1726882433.61255: checking for any_errors_fatal 13531 1726882433.61262: done checking for any_errors_fatal 13531 1726882433.61265: checking for max_fail_percentage 13531 1726882433.61267: done checking for max_fail_percentage 13531 1726882433.61268: checking to see if all hosts have failed and the running result is not ok 13531 1726882433.61268: done checking to see if all hosts have failed 13531 1726882433.61269: getting the remaining hosts for this loop 13531 1726882433.61270: done getting the remaining hosts for this loop 13531 1726882433.61274: getting the next task for host managed_node2 13531 1726882433.61279: done getting next task for host managed_node2 13531 1726882433.61281: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 13531 1726882433.61284: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882433.61287: getting variables 13531 1726882433.61289: in VariableManager get_vars() 13531 1726882433.61340: Calling all_inventory to load vars for managed_node2 13531 1726882433.61343: Calling groups_inventory to load vars for managed_node2 13531 1726882433.61345: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882433.61360: Calling all_plugins_play to load vars for managed_node2 13531 1726882433.61365: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882433.61368: Calling groups_plugins_play to load vars for managed_node2 13531 1726882433.62305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882433.63238: done with get_vars() 13531 1726882433.63255: done getting variables 13531 1726882433.63300: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13531 1726882433.63383: variable 'profile' from source: include params 13531 1726882433.63386: variable 'item' from source: include params 13531 1726882433.63427: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.0'] ********* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 21:33:53 -0400 
(0:00:00.039) 0:00:21.529 ****** 13531 1726882433.63455: entering _queue_task() for managed_node2/assert 13531 1726882433.63681: worker is 1 (out of 1 available) 13531 1726882433.63694: exiting _queue_task() for managed_node2/assert 13531 1726882433.63705: done queuing things up, now waiting for results queue to drain 13531 1726882433.63707: waiting for pending results... 13531 1726882433.63885: running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'bond0.0' 13531 1726882433.63960: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000035e 13531 1726882433.63970: variable 'ansible_search_path' from source: unknown 13531 1726882433.63974: variable 'ansible_search_path' from source: unknown 13531 1726882433.64004: calling self._execute() 13531 1726882433.64079: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882433.64082: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882433.64091: variable 'omit' from source: magic vars 13531 1726882433.64349: variable 'ansible_distribution_major_version' from source: facts 13531 1726882433.64368: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882433.64379: variable 'omit' from source: magic vars 13531 1726882433.64406: variable 'omit' from source: magic vars 13531 1726882433.64475: variable 'profile' from source: include params 13531 1726882433.64480: variable 'item' from source: include params 13531 1726882433.64524: variable 'item' from source: include params 13531 1726882433.64540: variable 'omit' from source: magic vars 13531 1726882433.64580: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882433.64605: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882433.64621: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 
1726882433.64634: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882433.64645: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882433.64671: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882433.64675: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882433.64677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882433.64748: Set connection var ansible_pipelining to False 13531 1726882433.64751: Set connection var ansible_timeout to 10 13531 1726882433.64757: Set connection var ansible_shell_executable to /bin/sh 13531 1726882433.64763: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882433.64767: Set connection var ansible_connection to ssh 13531 1726882433.64770: Set connection var ansible_shell_type to sh 13531 1726882433.64792: variable 'ansible_shell_executable' from source: unknown 13531 1726882433.64795: variable 'ansible_connection' from source: unknown 13531 1726882433.64799: variable 'ansible_module_compression' from source: unknown 13531 1726882433.64802: variable 'ansible_shell_type' from source: unknown 13531 1726882433.64805: variable 'ansible_shell_executable' from source: unknown 13531 1726882433.64807: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882433.64810: variable 'ansible_pipelining' from source: unknown 13531 1726882433.64813: variable 'ansible_timeout' from source: unknown 13531 1726882433.64815: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882433.64908: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882433.64918: variable 'omit' from source: magic vars 13531 1726882433.64923: starting attempt loop 13531 1726882433.64927: running the handler 13531 1726882433.65000: variable 'lsr_net_profile_ansible_managed' from source: set_fact 13531 1726882433.65005: Evaluated conditional (lsr_net_profile_ansible_managed): True 13531 1726882433.65010: handler run complete 13531 1726882433.65023: attempt loop complete, returning result 13531 1726882433.65026: _execute() done 13531 1726882433.65029: dumping result to json 13531 1726882433.65031: done dumping result, returning 13531 1726882433.65037: done running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'bond0.0' [0e448fcc-3ce9-4fd9-519d-00000000035e] 13531 1726882433.65048: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000035e 13531 1726882433.65134: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000035e 13531 1726882433.65137: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 13531 1726882433.65194: no more pending results, returning what we have 13531 1726882433.65197: results queue empty 13531 1726882433.65198: checking for any_errors_fatal 13531 1726882433.65204: done checking for any_errors_fatal 13531 1726882433.65205: checking for max_fail_percentage 13531 1726882433.65206: done checking for max_fail_percentage 13531 1726882433.65208: checking to see if all hosts have failed and the running result is not ok 13531 1726882433.65208: done checking to see if all hosts have failed 13531 1726882433.65209: getting the remaining hosts for this loop 13531 1726882433.65210: done getting the remaining hosts for this loop 13531 1726882433.65213: getting the next task for host managed_node2 13531 1726882433.65217: done getting 
next task for host managed_node2 13531 1726882433.65220: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 13531 1726882433.65227: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882433.65230: getting variables 13531 1726882433.65232: in VariableManager get_vars() 13531 1726882433.65286: Calling all_inventory to load vars for managed_node2 13531 1726882433.65289: Calling groups_inventory to load vars for managed_node2 13531 1726882433.65292: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882433.65301: Calling all_plugins_play to load vars for managed_node2 13531 1726882433.65303: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882433.65305: Calling groups_plugins_play to load vars for managed_node2 13531 1726882433.66135: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882433.67079: done with get_vars() 13531 1726882433.67098: done getting variables 13531 1726882433.67140: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13531 1726882433.67224: variable 'profile' from source: include params 13531 1726882433.67227: variable 'item' from 
source: include params 13531 1726882433.67270: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0.0] *************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 21:33:53 -0400 (0:00:00.038) 0:00:21.568 ****** 13531 1726882433.67297: entering _queue_task() for managed_node2/assert 13531 1726882433.67532: worker is 1 (out of 1 available) 13531 1726882433.67545: exiting _queue_task() for managed_node2/assert 13531 1726882433.67561: done queuing things up, now waiting for results queue to drain 13531 1726882433.67563: waiting for pending results... 13531 1726882433.67744: running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in bond0.0 13531 1726882433.67815: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000035f 13531 1726882433.67829: variable 'ansible_search_path' from source: unknown 13531 1726882433.67832: variable 'ansible_search_path' from source: unknown 13531 1726882433.67862: calling self._execute() 13531 1726882433.67939: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882433.67943: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882433.67955: variable 'omit' from source: magic vars 13531 1726882433.68221: variable 'ansible_distribution_major_version' from source: facts 13531 1726882433.68232: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882433.68237: variable 'omit' from source: magic vars 13531 1726882433.68269: variable 'omit' from source: magic vars 13531 1726882433.68337: variable 'profile' from source: include params 13531 1726882433.68342: variable 'item' from source: include params 13531 1726882433.68390: variable 'item' from source: include params 13531 1726882433.68403: variable 'omit' from source: magic vars 13531 1726882433.68443: 
trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882433.68473: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882433.68493: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882433.68506: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882433.68518: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882433.68541: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882433.68544: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882433.68547: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882433.68625: Set connection var ansible_pipelining to False 13531 1726882433.68629: Set connection var ansible_timeout to 10 13531 1726882433.68631: Set connection var ansible_shell_executable to /bin/sh 13531 1726882433.68635: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882433.68637: Set connection var ansible_connection to ssh 13531 1726882433.68640: Set connection var ansible_shell_type to sh 13531 1726882433.68664: variable 'ansible_shell_executable' from source: unknown 13531 1726882433.68667: variable 'ansible_connection' from source: unknown 13531 1726882433.68669: variable 'ansible_module_compression' from source: unknown 13531 1726882433.68674: variable 'ansible_shell_type' from source: unknown 13531 1726882433.68676: variable 'ansible_shell_executable' from source: unknown 13531 1726882433.68678: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882433.68680: variable 'ansible_pipelining' from source: unknown 13531 1726882433.68683: variable 'ansible_timeout' 
from source: unknown 13531 1726882433.68688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882433.68788: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882433.68798: variable 'omit' from source: magic vars 13531 1726882433.68807: starting attempt loop 13531 1726882433.68810: running the handler 13531 1726882433.68884: variable 'lsr_net_profile_fingerprint' from source: set_fact 13531 1726882433.68887: Evaluated conditional (lsr_net_profile_fingerprint): True 13531 1726882433.68892: handler run complete 13531 1726882433.68905: attempt loop complete, returning result 13531 1726882433.68907: _execute() done 13531 1726882433.68911: dumping result to json 13531 1726882433.68913: done dumping result, returning 13531 1726882433.68919: done running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in bond0.0 [0e448fcc-3ce9-4fd9-519d-00000000035f] 13531 1726882433.68925: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000035f 13531 1726882433.69006: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000035f 13531 1726882433.69010: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 13531 1726882433.69064: no more pending results, returning what we have 13531 1726882433.69068: results queue empty 13531 1726882433.69069: checking for any_errors_fatal 13531 1726882433.69076: done checking for any_errors_fatal 13531 1726882433.69077: checking for max_fail_percentage 13531 1726882433.69078: done checking for max_fail_percentage 13531 1726882433.69079: checking to see if all hosts have failed and the running result is not ok 13531 1726882433.69080: done checking to see if all 
hosts have failed 13531 1726882433.69081: getting the remaining hosts for this loop 13531 1726882433.69082: done getting the remaining hosts for this loop 13531 1726882433.69085: getting the next task for host managed_node2 13531 1726882433.69092: done getting next task for host managed_node2 13531 1726882433.69095: ^ task is: TASK: Include the task 'get_profile_stat.yml' 13531 1726882433.69098: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882433.69102: getting variables 13531 1726882433.69104: in VariableManager get_vars() 13531 1726882433.69156: Calling all_inventory to load vars for managed_node2 13531 1726882433.69160: Calling groups_inventory to load vars for managed_node2 13531 1726882433.69167: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882433.69178: Calling all_plugins_play to load vars for managed_node2 13531 1726882433.69180: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882433.69183: Calling groups_plugins_play to load vars for managed_node2 13531 1726882433.70085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882433.71014: done with get_vars() 13531 1726882433.71029: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 21:33:53 -0400 (0:00:00.038) 0:00:21.606 ****** 13531 1726882433.71100: entering _queue_task() for managed_node2/include_tasks 13531 1726882433.71324: worker is 1 (out of 1 available) 13531 1726882433.71338: exiting _queue_task() for managed_node2/include_tasks 13531 1726882433.71352: done queuing things up, now waiting for results queue to drain 13531 1726882433.71353: waiting for pending results... 13531 1726882433.71537: running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' 13531 1726882433.71619: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000363 13531 1726882433.71629: variable 'ansible_search_path' from source: unknown 13531 1726882433.71633: variable 'ansible_search_path' from source: unknown 13531 1726882433.71667: calling self._execute() 13531 1726882433.71737: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882433.71742: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882433.71749: variable 'omit' from source: magic vars 13531 1726882433.72023: variable 'ansible_distribution_major_version' from source: facts 13531 1726882433.72034: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882433.72039: _execute() done 13531 1726882433.72042: dumping result to json 13531 1726882433.72045: done dumping result, returning 13531 1726882433.72050: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' [0e448fcc-3ce9-4fd9-519d-000000000363] 13531 1726882433.72060: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000363 13531 1726882433.72148: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000363 13531 1726882433.72151: WORKER PROCESS EXITING 13531 1726882433.72180: no more pending results, returning what we have 13531 
1726882433.72184: in VariableManager get_vars() 13531 1726882433.72241: Calling all_inventory to load vars for managed_node2 13531 1726882433.72244: Calling groups_inventory to load vars for managed_node2 13531 1726882433.72246: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882433.72261: Calling all_plugins_play to load vars for managed_node2 13531 1726882433.72266: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882433.72269: Calling groups_plugins_play to load vars for managed_node2 13531 1726882433.73160: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882433.74095: done with get_vars() 13531 1726882433.74111: variable 'ansible_search_path' from source: unknown 13531 1726882433.74112: variable 'ansible_search_path' from source: unknown 13531 1726882433.74141: we have included files to process 13531 1726882433.74142: generating all_blocks data 13531 1726882433.74143: done generating all_blocks data 13531 1726882433.74148: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 13531 1726882433.74149: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 13531 1726882433.74150: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 13531 1726882433.74758: done processing included file 13531 1726882433.74759: iterating over new_blocks loaded from include file 13531 1726882433.74761: in VariableManager get_vars() 13531 1726882433.74782: done with get_vars() 13531 1726882433.74783: filtering new block on tags 13531 1726882433.74800: done filtering new block on tags 13531 1726882433.74802: in VariableManager get_vars() 13531 1726882433.74817: done with get_vars() 13531 1726882433.74818: filtering 
new block on tags 13531 1726882433.74832: done filtering new block on tags 13531 1726882433.74833: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node2 13531 1726882433.74836: extending task lists for all hosts with included blocks 13531 1726882433.74941: done extending task lists 13531 1726882433.74942: done processing included files 13531 1726882433.74943: results queue empty 13531 1726882433.74943: checking for any_errors_fatal 13531 1726882433.74945: done checking for any_errors_fatal 13531 1726882433.74946: checking for max_fail_percentage 13531 1726882433.74946: done checking for max_fail_percentage 13531 1726882433.74947: checking to see if all hosts have failed and the running result is not ok 13531 1726882433.74947: done checking to see if all hosts have failed 13531 1726882433.74948: getting the remaining hosts for this loop 13531 1726882433.74949: done getting the remaining hosts for this loop 13531 1726882433.74950: getting the next task for host managed_node2 13531 1726882433.74954: done getting next task for host managed_node2 13531 1726882433.74956: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 13531 1726882433.74958: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882433.74959: getting variables 13531 1726882433.74960: in VariableManager get_vars() 13531 1726882433.74974: Calling all_inventory to load vars for managed_node2 13531 1726882433.74975: Calling groups_inventory to load vars for managed_node2 13531 1726882433.74977: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882433.74981: Calling all_plugins_play to load vars for managed_node2 13531 1726882433.74982: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882433.74985: Calling groups_plugins_play to load vars for managed_node2 13531 1726882433.75692: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882433.76609: done with get_vars() 13531 1726882433.76629: done getting variables 13531 1726882433.76665: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:33:53 -0400 (0:00:00.055) 0:00:21.662 ****** 13531 1726882433.76690: entering _queue_task() for managed_node2/set_fact 13531 1726882433.76937: worker is 1 (out of 1 available) 13531 1726882433.76949: exiting _queue_task() for managed_node2/set_fact 13531 1726882433.76967: done queuing things up, now waiting for results queue to drain 13531 1726882433.76968: waiting for pending results... 
13531 1726882433.77151: running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag 13531 1726882433.77235: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000674 13531 1726882433.77246: variable 'ansible_search_path' from source: unknown 13531 1726882433.77250: variable 'ansible_search_path' from source: unknown 13531 1726882433.77284: calling self._execute() 13531 1726882433.77359: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882433.77363: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882433.77371: variable 'omit' from source: magic vars 13531 1726882433.77650: variable 'ansible_distribution_major_version' from source: facts 13531 1726882433.77665: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882433.77671: variable 'omit' from source: magic vars 13531 1726882433.77705: variable 'omit' from source: magic vars 13531 1726882433.77730: variable 'omit' from source: magic vars 13531 1726882433.77767: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882433.77794: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882433.77812: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882433.77831: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882433.77841: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882433.77871: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882433.77874: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882433.77877: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 13531 1726882433.77950: Set connection var ansible_pipelining to False 13531 1726882433.77958: Set connection var ansible_timeout to 10 13531 1726882433.77960: Set connection var ansible_shell_executable to /bin/sh 13531 1726882433.77970: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882433.77972: Set connection var ansible_connection to ssh 13531 1726882433.77975: Set connection var ansible_shell_type to sh 13531 1726882433.77994: variable 'ansible_shell_executable' from source: unknown 13531 1726882433.77997: variable 'ansible_connection' from source: unknown 13531 1726882433.78000: variable 'ansible_module_compression' from source: unknown 13531 1726882433.78002: variable 'ansible_shell_type' from source: unknown 13531 1726882433.78005: variable 'ansible_shell_executable' from source: unknown 13531 1726882433.78007: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882433.78009: variable 'ansible_pipelining' from source: unknown 13531 1726882433.78011: variable 'ansible_timeout' from source: unknown 13531 1726882433.78016: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882433.78118: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882433.78127: variable 'omit' from source: magic vars 13531 1726882433.78133: starting attempt loop 13531 1726882433.78137: running the handler 13531 1726882433.78150: handler run complete 13531 1726882433.78159: attempt loop complete, returning result 13531 1726882433.78161: _execute() done 13531 1726882433.78165: dumping result to json 13531 1726882433.78168: done dumping result, returning 13531 1726882433.78174: done running TaskExecutor() for 
managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag [0e448fcc-3ce9-4fd9-519d-000000000674] 13531 1726882433.78180: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000674 13531 1726882433.78272: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000674 13531 1726882433.78275: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 13531 1726882433.78340: no more pending results, returning what we have 13531 1726882433.78344: results queue empty 13531 1726882433.78345: checking for any_errors_fatal 13531 1726882433.78347: done checking for any_errors_fatal 13531 1726882433.78348: checking for max_fail_percentage 13531 1726882433.78349: done checking for max_fail_percentage 13531 1726882433.78350: checking to see if all hosts have failed and the running result is not ok 13531 1726882433.78351: done checking to see if all hosts have failed 13531 1726882433.78352: getting the remaining hosts for this loop 13531 1726882433.78359: done getting the remaining hosts for this loop 13531 1726882433.78362: getting the next task for host managed_node2 13531 1726882433.78371: done getting next task for host managed_node2 13531 1726882433.78374: ^ task is: TASK: Stat profile file 13531 1726882433.78378: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882433.78381: getting variables 13531 1726882433.78383: in VariableManager get_vars() 13531 1726882433.78428: Calling all_inventory to load vars for managed_node2 13531 1726882433.78431: Calling groups_inventory to load vars for managed_node2 13531 1726882433.78433: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882433.78442: Calling all_plugins_play to load vars for managed_node2 13531 1726882433.78445: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882433.78447: Calling groups_plugins_play to load vars for managed_node2 13531 1726882433.82790: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882433.83731: done with get_vars() 13531 1726882433.83751: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:33:53 -0400 (0:00:00.071) 0:00:21.733 ****** 13531 1726882433.83843: entering _queue_task() for managed_node2/stat 13531 1726882433.84094: worker is 1 (out of 1 available) 13531 1726882433.84107: exiting _queue_task() for managed_node2/stat 13531 1726882433.84119: done queuing things up, now waiting for results queue to drain 13531 1726882433.84120: waiting for pending results... 
13531 1726882433.84304: running TaskExecutor() for managed_node2/TASK: Stat profile file 13531 1726882433.84392: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000675 13531 1726882433.84403: variable 'ansible_search_path' from source: unknown 13531 1726882433.84407: variable 'ansible_search_path' from source: unknown 13531 1726882433.84441: calling self._execute() 13531 1726882433.84517: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882433.84520: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882433.84529: variable 'omit' from source: magic vars 13531 1726882433.84814: variable 'ansible_distribution_major_version' from source: facts 13531 1726882433.84823: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882433.84830: variable 'omit' from source: magic vars 13531 1726882433.84863: variable 'omit' from source: magic vars 13531 1726882433.84931: variable 'profile' from source: include params 13531 1726882433.84935: variable 'item' from source: include params 13531 1726882433.84982: variable 'item' from source: include params 13531 1726882433.85001: variable 'omit' from source: magic vars 13531 1726882433.85039: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882433.85069: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882433.85087: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882433.85106: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882433.85115: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882433.85139: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 
1726882433.85142: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882433.85145: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882433.85226: Set connection var ansible_pipelining to False 13531 1726882433.85229: Set connection var ansible_timeout to 10 13531 1726882433.85235: Set connection var ansible_shell_executable to /bin/sh 13531 1726882433.85240: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882433.85243: Set connection var ansible_connection to ssh 13531 1726882433.85246: Set connection var ansible_shell_type to sh 13531 1726882433.85275: variable 'ansible_shell_executable' from source: unknown 13531 1726882433.85283: variable 'ansible_connection' from source: unknown 13531 1726882433.85289: variable 'ansible_module_compression' from source: unknown 13531 1726882433.85294: variable 'ansible_shell_type' from source: unknown 13531 1726882433.85299: variable 'ansible_shell_executable' from source: unknown 13531 1726882433.85310: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882433.85319: variable 'ansible_pipelining' from source: unknown 13531 1726882433.85326: variable 'ansible_timeout' from source: unknown 13531 1726882433.85333: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882433.85550: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13531 1726882433.85574: variable 'omit' from source: magic vars 13531 1726882433.85585: starting attempt loop 13531 1726882433.85591: running the handler 13531 1726882433.85608: _low_level_execute_command(): starting 13531 1726882433.85620: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13531 1726882433.86641: stderr chunk 
(state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882433.86789: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882433.86793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882433.86797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882433.86838: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882433.86851: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882433.86875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882433.86897: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882433.86911: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882433.86924: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882433.86936: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882433.86951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882433.86974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882433.86992: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882433.87005: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882433.87019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882433.87215: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882433.87239: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882433.87261: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 13531 1726882433.87406: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882433.89071: stdout chunk (state=3): >>>/root <<< 13531 1726882433.89171: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882433.89224: stderr chunk (state=3): >>><<< 13531 1726882433.89227: stdout chunk (state=3): >>><<< 13531 1726882433.89253: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882433.89267: _low_level_execute_command(): starting 13531 1726882433.89275: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882433.8925025-14494-169224679966894 `" && echo 
ansible-tmp-1726882433.8925025-14494-169224679966894="` echo /root/.ansible/tmp/ansible-tmp-1726882433.8925025-14494-169224679966894 `" ) && sleep 0' 13531 1726882433.89870: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882433.89874: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882433.89876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882433.89879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882433.89969: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882433.89972: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882433.89975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882433.89977: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882433.89980: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882433.89982: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882433.89984: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882433.89986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882433.89989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882433.90017: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882433.90020: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882433.90023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882433.90487: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882433.90506: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882433.90637: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882433.92539: stdout chunk (state=3): >>>ansible-tmp-1726882433.8925025-14494-169224679966894=/root/.ansible/tmp/ansible-tmp-1726882433.8925025-14494-169224679966894 <<< 13531 1726882433.92740: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882433.92743: stdout chunk (state=3): >>><<< 13531 1726882433.92746: stderr chunk (state=3): >>><<< 13531 1726882433.93097: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882433.8925025-14494-169224679966894=/root/.ansible/tmp/ansible-tmp-1726882433.8925025-14494-169224679966894 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 13531 1726882433.93101: variable 'ansible_module_compression' from source: unknown 13531 1726882433.93104: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13531i_snf1g8/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 13531 1726882433.93107: variable 'ansible_facts' from source: unknown 13531 1726882433.93113: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882433.8925025-14494-169224679966894/AnsiballZ_stat.py 13531 1726882433.93183: Sending initial data 13531 1726882433.93186: Sent initial data (153 bytes) 13531 1726882433.94194: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882433.94221: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882433.94243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882433.94268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882433.94312: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882433.94325: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882433.94340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882433.94361: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882433.94381: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882433.94392: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882433.94404: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882433.94417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882433.94432: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882433.94443: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882433.94456: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882433.94475: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882433.94551: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882433.94572: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882433.94587: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882433.94883: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882433.96568: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13531 1726882433.96656: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13531 1726882433.96755: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13531i_snf1g8/tmpb5sbs_y7 /root/.ansible/tmp/ansible-tmp-1726882433.8925025-14494-169224679966894/AnsiballZ_stat.py <<< 13531 1726882433.96848: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13531 1726882433.98237: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 <<< 13531 1726882433.98431: stderr chunk (state=3): >>><<< 13531 1726882433.98435: stdout chunk (state=3): >>><<< 13531 1726882433.98438: done transferring module to remote 13531 1726882433.98441: _low_level_execute_command(): starting 13531 1726882433.98448: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882433.8925025-14494-169224679966894/ /root/.ansible/tmp/ansible-tmp-1726882433.8925025-14494-169224679966894/AnsiballZ_stat.py && sleep 0' 13531 1726882433.99115: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882433.99131: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882433.99150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882433.99184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882433.99232: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882433.99243: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882433.99259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882433.99280: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882433.99292: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882433.99302: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882433.99314: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882433.99333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882433.99348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882433.99362: stderr 
chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882433.99375: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882433.99388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882433.99473: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882433.99494: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882433.99508: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882433.99638: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882434.01492: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882434.01589: stderr chunk (state=3): >>><<< 13531 1726882434.01595: stdout chunk (state=3): >>><<< 13531 1726882434.01698: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882434.01702: _low_level_execute_command(): starting 13531 1726882434.01707: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882433.8925025-14494-169224679966894/AnsiballZ_stat.py && sleep 0' 13531 1726882434.02507: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882434.02523: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882434.02537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882434.02557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882434.02607: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882434.02619: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882434.02632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882434.02648: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882434.02661: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882434.02681: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882434.02696: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882434.02711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882434.02728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882434.02739: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 
1726882434.02749: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882434.02768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882434.02847: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882434.02870: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882434.02885: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882434.03042: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882434.16299: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 13531 1726882434.17394: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 13531 1726882434.17398: stdout chunk (state=3): >>><<< 13531 1726882434.17401: stderr chunk (state=3): >>><<< 13531 1726882434.17562: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
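The exchange above shows how an Ansible module round-trip ends: the AnsiballZ wrapper prints a single JSON document on stdout (here the `stat` result for `ifcfg-bond0.1`), while all the SSH `debug1:`/`debug2:` chatter stays on stderr, so the controller can parse stdout directly. A minimal sketch of that parsing step, using a hypothetical helper name and a trimmed-down copy of the result seen in the log:

```python
import json

def parse_module_result(stdout: str) -> dict:
    # The module wrapper emits exactly one JSON object on stdout;
    # SSH debug output is confined to stderr, so no filtering is needed.
    return json.loads(stdout)

# Trimmed-down version of the stdout chunk captured in the log above.
stdout = (
    '{"changed": false, "stat": {"exists": false}, '
    '"invocation": {"module_args": {"path": '
    '"/etc/sysconfig/network-scripts/ifcfg-bond0.1"}}}'
)
result = parse_module_result(stdout)
print(result["stat"]["exists"])  # the profile file is absent on the target
```

The `"exists": false` value is what the next task's conditional is evaluated against.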
13531 1726882434.17572: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882433.8925025-14494-169224679966894/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13531 1726882434.17576: _low_level_execute_command(): starting 13531 1726882434.17578: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882433.8925025-14494-169224679966894/ > /dev/null 2>&1 && sleep 0' 13531 1726882434.19289: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882434.19307: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882434.19324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882434.19343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882434.19395: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882434.19408: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882434.19422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882434.19440: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 
13531 1726882434.19451: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882434.19467: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882434.19481: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882434.19494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882434.19509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882434.19521: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882434.19532: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882434.19544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882434.19625: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882434.19648: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882434.19669: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882434.19804: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882434.21666: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882434.21770: stderr chunk (state=3): >>><<< 13531 1726882434.21782: stdout chunk (state=3): >>><<< 13531 1726882434.21869: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882434.21872: handler run complete 13531 1726882434.21874: attempt loop complete, returning result 13531 1726882434.21876: _execute() done 13531 1726882434.21878: dumping result to json 13531 1726882434.21880: done dumping result, returning 13531 1726882434.21882: done running TaskExecutor() for managed_node2/TASK: Stat profile file [0e448fcc-3ce9-4fd9-519d-000000000675] 13531 1726882434.21884: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000675 13531 1726882434.22144: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000675 13531 1726882434.22147: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 13531 1726882434.22246: no more pending results, returning what we have 13531 1726882434.22249: results queue empty 13531 1726882434.22250: checking for any_errors_fatal 13531 1726882434.22276: done checking for any_errors_fatal 13531 1726882434.22277: checking for max_fail_percentage 13531 1726882434.22279: done checking for max_fail_percentage 13531 1726882434.22280: checking to see if all hosts have failed and the running result is not ok 13531 1726882434.22281: done checking to see if all hosts have failed 13531 1726882434.22282: getting the remaining hosts for this 
loop 13531 1726882434.22285: done getting the remaining hosts for this loop 13531 1726882434.22290: getting the next task for host managed_node2 13531 1726882434.22297: done getting next task for host managed_node2 13531 1726882434.22300: ^ task is: TASK: Set NM profile exist flag based on the profile files 13531 1726882434.22305: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882434.22314: getting variables 13531 1726882434.22316: in VariableManager get_vars() 13531 1726882434.22379: Calling all_inventory to load vars for managed_node2 13531 1726882434.22384: Calling groups_inventory to load vars for managed_node2 13531 1726882434.22387: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882434.22400: Calling all_plugins_play to load vars for managed_node2 13531 1726882434.22403: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882434.22405: Calling groups_plugins_play to load vars for managed_node2 13531 1726882434.25289: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882434.27202: done with get_vars() 13531 1726882434.27230: done getting variables 13531 1726882434.27300: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:33:54 -0400 (0:00:00.434) 0:00:22.168 ****** 13531 1726882434.27337: entering _queue_task() for managed_node2/set_fact 13531 1726882434.27863: worker is 1 (out of 1 available) 13531 1726882434.27878: exiting _queue_task() for managed_node2/set_fact 13531 1726882434.27892: done queuing things up, now waiting for results queue to drain 13531 1726882434.27897: waiting for pending results... 
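The task queued next is skipped because its `when:` condition (`profile_stat.stat.exists`) evaluates to `False` against the fact set by the earlier stat task. A simplified sketch of that skip logic, with assumed names; Ansible actually renders the expression through Jinja2, and plain `eval()` stands in for it here:

```python
from types import SimpleNamespace

def evaluate_when(condition: str, variables: dict) -> bool:
    # Stand-in for Jinja2 templating of a `when:` expression (simplified).
    return bool(eval(condition, {}, variables))

# Fact recorded by the preceding "Stat profile file" task: the file is absent.
profile_stat = SimpleNamespace(stat=SimpleNamespace(exists=False))

if evaluate_when("profile_stat.stat.exists", {"profile_stat": profile_stat}):
    result = {"changed": True}
else:
    # Mirrors the skipped-task result emitted in the log below.
    result = {"changed": False,
              "false_condition": "profile_stat.stat.exists",
              "skip_reason": "Conditional result was False"}
print(result["skip_reason"])
```

This matches the `skipping: [managed_node2]` payload that appears in the log for the "Set NM profile exist flag based on the profile files" task.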
13531 1726882434.28210: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files 13531 1726882434.28366: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000676 13531 1726882434.28385: variable 'ansible_search_path' from source: unknown 13531 1726882434.28393: variable 'ansible_search_path' from source: unknown 13531 1726882434.28434: calling self._execute() 13531 1726882434.28550: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882434.28602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882434.28616: variable 'omit' from source: magic vars 13531 1726882434.29033: variable 'ansible_distribution_major_version' from source: facts 13531 1726882434.29051: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882434.29194: variable 'profile_stat' from source: set_fact 13531 1726882434.29213: Evaluated conditional (profile_stat.stat.exists): False 13531 1726882434.29228: when evaluation is False, skipping this task 13531 1726882434.29236: _execute() done 13531 1726882434.29242: dumping result to json 13531 1726882434.29249: done dumping result, returning 13531 1726882434.29262: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files [0e448fcc-3ce9-4fd9-519d-000000000676] 13531 1726882434.29277: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000676 skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13531 1726882434.29425: no more pending results, returning what we have 13531 1726882434.29429: results queue empty 13531 1726882434.29430: checking for any_errors_fatal 13531 1726882434.29438: done checking for any_errors_fatal 13531 1726882434.29439: checking for max_fail_percentage 13531 1726882434.29440: done checking for max_fail_percentage 13531 1726882434.29441: checking to see if all 
hosts have failed and the running result is not ok 13531 1726882434.29442: done checking to see if all hosts have failed 13531 1726882434.29443: getting the remaining hosts for this loop 13531 1726882434.29444: done getting the remaining hosts for this loop 13531 1726882434.29448: getting the next task for host managed_node2 13531 1726882434.29457: done getting next task for host managed_node2 13531 1726882434.29460: ^ task is: TASK: Get NM profile info 13531 1726882434.29466: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882434.29471: getting variables 13531 1726882434.29473: in VariableManager get_vars() 13531 1726882434.29532: Calling all_inventory to load vars for managed_node2 13531 1726882434.29535: Calling groups_inventory to load vars for managed_node2 13531 1726882434.29537: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882434.29552: Calling all_plugins_play to load vars for managed_node2 13531 1726882434.29558: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882434.29561: Calling groups_plugins_play to load vars for managed_node2 13531 1726882434.30640: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000676 13531 1726882434.30645: WORKER PROCESS EXITING 13531 1726882434.31447: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882434.33449: done with get_vars() 13531 1726882434.33477: done getting variables 13531 1726882434.33542: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:33:54 -0400 (0:00:00.062) 0:00:22.231 ****** 13531 1726882434.33580: entering _queue_task() for managed_node2/shell 13531 1726882434.33889: worker is 1 (out of 1 available) 13531 1726882434.33901: exiting _queue_task() for managed_node2/shell 13531 1726882434.33913: done queuing things up, now waiting for results queue to drain 13531 1726882434.33915: waiting for pending results... 
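Every remote command in this log runs through `_low_level_execute_command()`, which wraps it as `/bin/sh -c '... && sleep 0'` and captures rc/stdout/stderr separately. A local sketch of that wrapping, using `subprocess` in place of the SSH ControlMaster connection the log shows (the helper name is illustrative, not Ansible's API):

```python
import subprocess

def low_level_execute(cmd: str):
    # Ansible appends `&& sleep 0` to flush output before the shell exits;
    # here the command runs locally rather than over the multiplexed SSH
    # connection seen in the debug output above.
    wrapped = f"{cmd} && sleep 0"
    proc = subprocess.run(["/bin/sh", "-c", wrapped],
                          capture_output=True, text=True)
    return proc.returncode, proc.stdout, proc.stderr

# Same probe the log shows for home-directory discovery: `echo ~ && sleep 0`.
rc, out, err = low_level_execute("echo ~")
```

On the managed node in the log this returns `rc=0` with `/root` on stdout, which is then used to expand the remote temporary directory path.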
13531 1726882434.34247: running TaskExecutor() for managed_node2/TASK: Get NM profile info 13531 1726882434.34419: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000677 13531 1726882434.34438: variable 'ansible_search_path' from source: unknown 13531 1726882434.34446: variable 'ansible_search_path' from source: unknown 13531 1726882434.34504: calling self._execute() 13531 1726882434.34607: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882434.34621: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882434.34636: variable 'omit' from source: magic vars 13531 1726882434.35073: variable 'ansible_distribution_major_version' from source: facts 13531 1726882434.35091: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882434.35101: variable 'omit' from source: magic vars 13531 1726882434.35161: variable 'omit' from source: magic vars 13531 1726882434.35282: variable 'profile' from source: include params 13531 1726882434.35292: variable 'item' from source: include params 13531 1726882434.35372: variable 'item' from source: include params 13531 1726882434.35401: variable 'omit' from source: magic vars 13531 1726882434.35467: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882434.35510: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882434.35545: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882434.35582: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882434.35598: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882434.35636: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 
1726882434.35649: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882434.35660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882434.35800: Set connection var ansible_pipelining to False 13531 1726882434.35810: Set connection var ansible_timeout to 10 13531 1726882434.35818: Set connection var ansible_shell_executable to /bin/sh 13531 1726882434.35825: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882434.35830: Set connection var ansible_connection to ssh 13531 1726882434.35835: Set connection var ansible_shell_type to sh 13531 1726882434.35869: variable 'ansible_shell_executable' from source: unknown 13531 1726882434.35877: variable 'ansible_connection' from source: unknown 13531 1726882434.35883: variable 'ansible_module_compression' from source: unknown 13531 1726882434.35895: variable 'ansible_shell_type' from source: unknown 13531 1726882434.35906: variable 'ansible_shell_executable' from source: unknown 13531 1726882434.35911: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882434.35917: variable 'ansible_pipelining' from source: unknown 13531 1726882434.35924: variable 'ansible_timeout' from source: unknown 13531 1726882434.35940: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882434.36118: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882434.36139: variable 'omit' from source: magic vars 13531 1726882434.36150: starting attempt loop 13531 1726882434.36161: running the handler 13531 1726882434.36178: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882434.36201: _low_level_execute_command(): starting 13531 1726882434.36212: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13531 1726882434.37663: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882434.37675: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882434.37687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882434.37704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882434.37747: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882434.37758: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882434.37762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882434.37777: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882434.37785: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882434.37793: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882434.37800: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882434.37809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882434.37821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882434.37828: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882434.37836: stderr chunk (state=3): >>>debug2: match found <<< 13531 
1726882434.37846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882434.37917: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882434.37931: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882434.37941: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882434.38084: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882434.39745: stdout chunk (state=3): >>>/root <<< 13531 1726882434.39849: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882434.39922: stderr chunk (state=3): >>><<< 13531 1726882434.39925: stdout chunk (state=3): >>><<< 13531 1726882434.40029: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received 
exit status from master 0 13531 1726882434.40039: _low_level_execute_command(): starting 13531 1726882434.40042: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882434.3994405-14524-194613849882424 `" && echo ansible-tmp-1726882434.3994405-14524-194613849882424="` echo /root/.ansible/tmp/ansible-tmp-1726882434.3994405-14524-194613849882424 `" ) && sleep 0' 13531 1726882434.41244: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882434.41248: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882434.41250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882434.41294: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 13531 1726882434.41297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882434.41300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882434.41376: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882434.41389: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882434.41516: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882434.43396: stdout chunk (state=3): >>>ansible-tmp-1726882434.3994405-14524-194613849882424=/root/.ansible/tmp/ansible-tmp-1726882434.3994405-14524-194613849882424 <<< 13531 1726882434.43512: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882434.43596: stderr chunk (state=3): >>><<< 13531 1726882434.43608: stdout chunk (state=3): >>><<< 13531 1726882434.43875: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882434.3994405-14524-194613849882424=/root/.ansible/tmp/ansible-tmp-1726882434.3994405-14524-194613849882424 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882434.43878: variable 'ansible_module_compression' from source: unknown 13531 1726882434.43881: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-13531i_snf1g8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13531 1726882434.43883: variable 'ansible_facts' from source: unknown 13531 1726882434.43885: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882434.3994405-14524-194613849882424/AnsiballZ_command.py 13531 1726882434.44009: Sending initial data 13531 1726882434.44012: Sent initial data (156 bytes) 13531 1726882434.44974: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882434.44990: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882434.45005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882434.45023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882434.45066: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882434.45086: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882434.45100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882434.45117: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882434.45128: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882434.45138: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882434.45149: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882434.45162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882434.45184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882434.45202: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 
13531 1726882434.45214: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882434.45227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882434.45311: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882434.45332: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882434.45346: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882434.45479: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882434.47287: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13531 1726882434.47382: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13531 1726882434.47490: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13531i_snf1g8/tmpoz5o10aq /root/.ansible/tmp/ansible-tmp-1726882434.3994405-14524-194613849882424/AnsiballZ_command.py <<< 13531 1726882434.47585: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13531 1726882434.48616: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882434.48769: stderr chunk (state=3): >>><<< 13531 1726882434.48772: stdout chunk (state=3): >>><<< 13531 1726882434.48776: done 
transferring module to remote 13531 1726882434.48778: _low_level_execute_command(): starting 13531 1726882434.48780: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882434.3994405-14524-194613849882424/ /root/.ansible/tmp/ansible-tmp-1726882434.3994405-14524-194613849882424/AnsiballZ_command.py && sleep 0' 13531 1726882434.49191: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882434.49197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882434.49228: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882434.49235: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882434.49241: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 13531 1726882434.49246: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882434.49258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882434.49263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882434.49277: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882434.49332: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882434.49335: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 
13531 1726882434.49443: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882434.51238: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882434.51289: stderr chunk (state=3): >>><<< 13531 1726882434.51296: stdout chunk (state=3): >>><<< 13531 1726882434.51312: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882434.51320: _low_level_execute_command(): starting 13531 1726882434.51328: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882434.3994405-14524-194613849882424/AnsiballZ_command.py && sleep 0' 13531 1726882434.51829: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882434.51843: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 13531 1726882434.51856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882434.51876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882434.51914: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882434.51927: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882434.51940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882434.51957: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882434.51971: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882434.51982: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882434.51994: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882434.52008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882434.52024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882434.52036: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882434.52049: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882434.52062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882434.52138: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882434.52160: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882434.52181: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882434.52316: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 13531 1726882434.67970: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-20 21:33:54.654030", "end": "2024-09-20 21:33:54.677785", "delta": "0:00:00.023755", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13531 1726882434.69371: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 13531 1726882434.69395: stdout chunk (state=3): >>><<< 13531 1726882434.69398: stderr chunk (state=3): >>><<< 13531 1726882434.69535: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-20 21:33:54.654030", "end": "2024-09-20 21:33:54.677785", "delta": "0:00:00.023755", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 13531 1726882434.69545: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882434.3994405-14524-194613849882424/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13531 1726882434.69548: _low_level_execute_command(): starting 13531 1726882434.69550: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882434.3994405-14524-194613849882424/ > /dev/null 2>&1 && sleep 0' 13531 1726882434.70155: stderr 
chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882434.70172: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882434.70187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882434.70211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882434.70253: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882434.70268: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882434.70283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882434.70301: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882434.70315: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882434.70330: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882434.70343: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882434.70357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882434.70375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882434.70388: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882434.70399: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882434.70412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882434.70495: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882434.70519: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882434.70544: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882434.70682: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882434.72520: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882434.72603: stderr chunk (state=3): >>><<< 13531 1726882434.72612: stdout chunk (state=3): >>><<< 13531 1726882434.72873: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882434.72876: handler run complete 13531 1726882434.72878: Evaluated conditional (False): False 13531 1726882434.72881: attempt loop complete, returning result 13531 1726882434.72882: _execute() done 13531 1726882434.72884: dumping result to json 13531 1726882434.72886: done dumping result, returning 13531 1726882434.72888: done running TaskExecutor() for managed_node2/TASK: Get NM profile 
info [0e448fcc-3ce9-4fd9-519d-000000000677] 13531 1726882434.72890: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000677 13531 1726882434.72964: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000677 13531 1726882434.72968: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "delta": "0:00:00.023755", "end": "2024-09-20 21:33:54.677785", "rc": 0, "start": "2024-09-20 21:33:54.654030" } STDOUT: bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection 13531 1726882434.73044: no more pending results, returning what we have 13531 1726882434.73048: results queue empty 13531 1726882434.73049: checking for any_errors_fatal 13531 1726882434.73053: done checking for any_errors_fatal 13531 1726882434.73054: checking for max_fail_percentage 13531 1726882434.73056: done checking for max_fail_percentage 13531 1726882434.73057: checking to see if all hosts have failed and the running result is not ok 13531 1726882434.73058: done checking to see if all hosts have failed 13531 1726882434.73058: getting the remaining hosts for this loop 13531 1726882434.73060: done getting the remaining hosts for this loop 13531 1726882434.73065: getting the next task for host managed_node2 13531 1726882434.73072: done getting next task for host managed_node2 13531 1726882434.73074: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 13531 1726882434.73079: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882434.73083: getting variables 13531 1726882434.73085: in VariableManager get_vars() 13531 1726882434.73141: Calling all_inventory to load vars for managed_node2 13531 1726882434.73144: Calling groups_inventory to load vars for managed_node2 13531 1726882434.73147: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882434.73158: Calling all_plugins_play to load vars for managed_node2 13531 1726882434.73161: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882434.73169: Calling groups_plugins_play to load vars for managed_node2 13531 1726882434.74920: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882434.76688: done with get_vars() 13531 1726882434.76717: done getting variables 13531 1726882434.76785: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:33:54 -0400 (0:00:00.432) 0:00:22.663 ****** 13531 1726882434.76818: entering _queue_task() for 
managed_node2/set_fact 13531 1726882434.77168: worker is 1 (out of 1 available) 13531 1726882434.77185: exiting _queue_task() for managed_node2/set_fact 13531 1726882434.77197: done queuing things up, now waiting for results queue to drain 13531 1726882434.77199: waiting for pending results... 13531 1726882434.77502: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 13531 1726882434.77636: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000678 13531 1726882434.77655: variable 'ansible_search_path' from source: unknown 13531 1726882434.77662: variable 'ansible_search_path' from source: unknown 13531 1726882434.77703: calling self._execute() 13531 1726882434.77799: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882434.77809: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882434.77820: variable 'omit' from source: magic vars 13531 1726882434.78210: variable 'ansible_distribution_major_version' from source: facts 13531 1726882434.78228: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882434.78361: variable 'nm_profile_exists' from source: set_fact 13531 1726882434.78387: Evaluated conditional (nm_profile_exists.rc == 0): True 13531 1726882434.78402: variable 'omit' from source: magic vars 13531 1726882434.78453: variable 'omit' from source: magic vars 13531 1726882434.78500: variable 'omit' from source: magic vars 13531 1726882434.78549: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882434.78588: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882434.78624: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882434.78647: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 13531 1726882434.78668: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882434.78706: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882434.78715: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882434.78727: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882434.78836: Set connection var ansible_pipelining to False 13531 1726882434.78847: Set connection var ansible_timeout to 10 13531 1726882434.78856: Set connection var ansible_shell_executable to /bin/sh 13531 1726882434.78870: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882434.78877: Set connection var ansible_connection to ssh 13531 1726882434.78884: Set connection var ansible_shell_type to sh 13531 1726882434.78913: variable 'ansible_shell_executable' from source: unknown 13531 1726882434.78927: variable 'ansible_connection' from source: unknown 13531 1726882434.78936: variable 'ansible_module_compression' from source: unknown 13531 1726882434.78946: variable 'ansible_shell_type' from source: unknown 13531 1726882434.78953: variable 'ansible_shell_executable' from source: unknown 13531 1726882434.78960: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882434.78970: variable 'ansible_pipelining' from source: unknown 13531 1726882434.78978: variable 'ansible_timeout' from source: unknown 13531 1726882434.78987: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882434.79132: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 
13531 1726882434.79157: variable 'omit' from source: magic vars 13531 1726882434.79171: starting attempt loop 13531 1726882434.79179: running the handler 13531 1726882434.79197: handler run complete 13531 1726882434.79212: attempt loop complete, returning result 13531 1726882434.79219: _execute() done 13531 1726882434.79225: dumping result to json 13531 1726882434.79232: done dumping result, returning 13531 1726882434.79245: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0e448fcc-3ce9-4fd9-519d-000000000678] 13531 1726882434.79260: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000678 13531 1726882434.79376: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000678 ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 13531 1726882434.79432: no more pending results, returning what we have 13531 1726882434.79435: results queue empty 13531 1726882434.79436: checking for any_errors_fatal 13531 1726882434.79445: done checking for any_errors_fatal 13531 1726882434.79446: checking for max_fail_percentage 13531 1726882434.79448: done checking for max_fail_percentage 13531 1726882434.79450: checking to see if all hosts have failed and the running result is not ok 13531 1726882434.79450: done checking to see if all hosts have failed 13531 1726882434.79451: getting the remaining hosts for this loop 13531 1726882434.79452: done getting the remaining hosts for this loop 13531 1726882434.79456: getting the next task for host managed_node2 13531 1726882434.79468: done getting next task for host managed_node2 13531 1726882434.79471: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 13531 1726882434.79478: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882434.79483: getting variables 13531 1726882434.79485: in VariableManager get_vars() 13531 1726882434.79545: Calling all_inventory to load vars for managed_node2 13531 1726882434.79548: Calling groups_inventory to load vars for managed_node2 13531 1726882434.79551: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882434.79563: Calling all_plugins_play to load vars for managed_node2 13531 1726882434.79568: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882434.79572: Calling groups_plugins_play to load vars for managed_node2 13531 1726882434.80646: WORKER PROCESS EXITING 13531 1726882434.81241: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882434.82209: done with get_vars() 13531 1726882434.82225: done getting variables 13531 1726882434.82274: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13531 1726882434.82362: variable 'profile' from 
source: include params 13531 1726882434.82366: variable 'item' from source: include params 13531 1726882434.82410: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.1] ************************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:33:54 -0400 (0:00:00.056) 0:00:22.719 ****** 13531 1726882434.82437: entering _queue_task() for managed_node2/command 13531 1726882434.82690: worker is 1 (out of 1 available) 13531 1726882434.82703: exiting _queue_task() for managed_node2/command 13531 1726882434.82713: done queuing things up, now waiting for results queue to drain 13531 1726882434.82714: waiting for pending results... 13531 1726882434.83000: running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-bond0.1 13531 1726882434.83108: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000067a 13531 1726882434.83121: variable 'ansible_search_path' from source: unknown 13531 1726882434.83125: variable 'ansible_search_path' from source: unknown 13531 1726882434.83173: calling self._execute() 13531 1726882434.83274: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882434.83281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882434.83289: variable 'omit' from source: magic vars 13531 1726882434.83698: variable 'ansible_distribution_major_version' from source: facts 13531 1726882434.83717: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882434.83924: variable 'profile_stat' from source: set_fact 13531 1726882434.83968: Evaluated conditional (profile_stat.stat.exists): False 13531 1726882434.83971: when evaluation is False, skipping this task 13531 1726882434.83974: _execute() done 13531 1726882434.83976: dumping result to json 13531 1726882434.83978: done dumping result, returning 13531 
1726882434.83994: done running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-bond0.1 [0e448fcc-3ce9-4fd9-519d-00000000067a] 13531 1726882434.83998: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000067a skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13531 1726882434.84158: no more pending results, returning what we have 13531 1726882434.84165: results queue empty 13531 1726882434.84166: checking for any_errors_fatal 13531 1726882434.84177: done checking for any_errors_fatal 13531 1726882434.84178: checking for max_fail_percentage 13531 1726882434.84179: done checking for max_fail_percentage 13531 1726882434.84180: checking to see if all hosts have failed and the running result is not ok 13531 1726882434.84181: done checking to see if all hosts have failed 13531 1726882434.84182: getting the remaining hosts for this loop 13531 1726882434.84183: done getting the remaining hosts for this loop 13531 1726882434.84187: getting the next task for host managed_node2 13531 1726882434.84194: done getting next task for host managed_node2 13531 1726882434.84196: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 13531 1726882434.84200: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882434.84204: getting variables 13531 1726882434.84206: in VariableManager get_vars() 13531 1726882434.84257: Calling all_inventory to load vars for managed_node2 13531 1726882434.84261: Calling groups_inventory to load vars for managed_node2 13531 1726882434.84265: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882434.84276: Calling all_plugins_play to load vars for managed_node2 13531 1726882434.84278: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882434.84282: Calling groups_plugins_play to load vars for managed_node2 13531 1726882434.84799: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000067a 13531 1726882434.84803: WORKER PROCESS EXITING 13531 1726882434.85223: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882434.86307: done with get_vars() 13531 1726882434.86333: done getting variables 13531 1726882434.86393: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13531 1726882434.86511: variable 'profile' from source: include params 13531 1726882434.86515: variable 'item' from source: include params 13531 1726882434.86578: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.1] ********************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:33:54 -0400 (0:00:00.041) 0:00:22.761 ****** 13531 1726882434.86609: entering 
_queue_task() for managed_node2/set_fact 13531 1726882434.86931: worker is 1 (out of 1 available) 13531 1726882434.86943: exiting _queue_task() for managed_node2/set_fact 13531 1726882434.86959: done queuing things up, now waiting for results queue to drain 13531 1726882434.86960: waiting for pending results... 13531 1726882434.87275: running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 13531 1726882434.87380: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000067b 13531 1726882434.87392: variable 'ansible_search_path' from source: unknown 13531 1726882434.87396: variable 'ansible_search_path' from source: unknown 13531 1726882434.87434: calling self._execute() 13531 1726882434.87538: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882434.87541: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882434.87551: variable 'omit' from source: magic vars 13531 1726882434.87926: variable 'ansible_distribution_major_version' from source: facts 13531 1726882434.87943: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882434.88079: variable 'profile_stat' from source: set_fact 13531 1726882434.88085: Evaluated conditional (profile_stat.stat.exists): False 13531 1726882434.88088: when evaluation is False, skipping this task 13531 1726882434.88091: _execute() done 13531 1726882434.88094: dumping result to json 13531 1726882434.88096: done dumping result, returning 13531 1726882434.88108: done running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 [0e448fcc-3ce9-4fd9-519d-00000000067b] 13531 1726882434.88111: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000067b 13531 1726882434.88201: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000067b 13531 1726882434.88205: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": 
"profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13531 1726882434.88263: no more pending results, returning what we have 13531 1726882434.88268: results queue empty 13531 1726882434.88269: checking for any_errors_fatal 13531 1726882434.88277: done checking for any_errors_fatal 13531 1726882434.88278: checking for max_fail_percentage 13531 1726882434.88280: done checking for max_fail_percentage 13531 1726882434.88281: checking to see if all hosts have failed and the running result is not ok 13531 1726882434.88282: done checking to see if all hosts have failed 13531 1726882434.88282: getting the remaining hosts for this loop 13531 1726882434.88283: done getting the remaining hosts for this loop 13531 1726882434.88286: getting the next task for host managed_node2 13531 1726882434.88293: done getting next task for host managed_node2 13531 1726882434.88295: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 13531 1726882434.88300: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882434.88303: getting variables 13531 1726882434.88305: in VariableManager get_vars() 13531 1726882434.88375: Calling all_inventory to load vars for managed_node2 13531 1726882434.88378: Calling groups_inventory to load vars for managed_node2 13531 1726882434.88380: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882434.88393: Calling all_plugins_play to load vars for managed_node2 13531 1726882434.88396: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882434.88399: Calling groups_plugins_play to load vars for managed_node2 13531 1726882434.90040: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882434.92719: done with get_vars() 13531 1726882434.92750: done getting variables 13531 1726882434.92819: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13531 1726882434.92944: variable 'profile' from source: include params 13531 1726882434.92948: variable 'item' from source: include params 13531 1726882434.93027: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.1] **************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:33:54 -0400 (0:00:00.064) 0:00:22.825 ****** 13531 1726882434.93057: entering _queue_task() for managed_node2/command 13531 1726882434.93307: worker is 1 (out of 1 available) 13531 1726882434.93319: exiting _queue_task() for managed_node2/command 13531 1726882434.93331: done queuing things up, now waiting for results queue to drain 13531 1726882434.93332: waiting for pending results... 
13531 1726882434.93520: running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-bond0.1 13531 1726882434.93592: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000067c 13531 1726882434.93603: variable 'ansible_search_path' from source: unknown 13531 1726882434.93607: variable 'ansible_search_path' from source: unknown 13531 1726882434.93638: calling self._execute() 13531 1726882434.93709: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882434.93713: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882434.93721: variable 'omit' from source: magic vars 13531 1726882434.93993: variable 'ansible_distribution_major_version' from source: facts 13531 1726882434.94002: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882434.94090: variable 'profile_stat' from source: set_fact 13531 1726882434.94100: Evaluated conditional (profile_stat.stat.exists): False 13531 1726882434.94104: when evaluation is False, skipping this task 13531 1726882434.94107: _execute() done 13531 1726882434.94110: dumping result to json 13531 1726882434.94113: done dumping result, returning 13531 1726882434.94117: done running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-bond0.1 [0e448fcc-3ce9-4fd9-519d-00000000067c] 13531 1726882434.94123: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000067c 13531 1726882434.94206: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000067c 13531 1726882434.94209: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13531 1726882434.94280: no more pending results, returning what we have 13531 1726882434.94284: results queue empty 13531 1726882434.94285: checking for any_errors_fatal 13531 1726882434.94291: done checking for any_errors_fatal 13531 1726882434.94292: checking for 
max_fail_percentage 13531 1726882434.94294: done checking for max_fail_percentage 13531 1726882434.94294: checking to see if all hosts have failed and the running result is not ok 13531 1726882434.94295: done checking to see if all hosts have failed 13531 1726882434.94296: getting the remaining hosts for this loop 13531 1726882434.94297: done getting the remaining hosts for this loop 13531 1726882434.94300: getting the next task for host managed_node2 13531 1726882434.94306: done getting next task for host managed_node2 13531 1726882434.94308: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 13531 1726882434.94312: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882434.94316: getting variables 13531 1726882434.94321: in VariableManager get_vars() 13531 1726882434.94373: Calling all_inventory to load vars for managed_node2 13531 1726882434.94376: Calling groups_inventory to load vars for managed_node2 13531 1726882434.94378: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882434.94388: Calling all_plugins_play to load vars for managed_node2 13531 1726882434.94390: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882434.94392: Calling groups_plugins_play to load vars for managed_node2 13531 1726882434.95311: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882434.96688: done with get_vars() 13531 1726882434.96705: done getting variables 13531 1726882434.96745: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13531 1726882434.96826: variable 'profile' from source: include params 13531 1726882434.96829: variable 'item' from source: include params 13531 1726882434.96870: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.1] ************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:33:54 -0400 (0:00:00.038) 0:00:22.864 ****** 13531 1726882434.96895: entering _queue_task() for managed_node2/set_fact 13531 1726882434.97114: worker is 1 (out of 1 available) 13531 1726882434.97128: exiting _queue_task() for managed_node2/set_fact 13531 1726882434.97142: done queuing things up, now waiting for results queue to drain 13531 1726882434.97143: waiting for pending results... 
13531 1726882434.97324: running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-bond0.1 13531 1726882434.97403: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000067d 13531 1726882434.97413: variable 'ansible_search_path' from source: unknown 13531 1726882434.97417: variable 'ansible_search_path' from source: unknown 13531 1726882434.97448: calling self._execute() 13531 1726882434.97519: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882434.97522: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882434.97532: variable 'omit' from source: magic vars 13531 1726882434.97795: variable 'ansible_distribution_major_version' from source: facts 13531 1726882434.97806: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882434.97891: variable 'profile_stat' from source: set_fact 13531 1726882434.97902: Evaluated conditional (profile_stat.stat.exists): False 13531 1726882434.97905: when evaluation is False, skipping this task 13531 1726882434.97910: _execute() done 13531 1726882434.97913: dumping result to json 13531 1726882434.97916: done dumping result, returning 13531 1726882434.97918: done running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-bond0.1 [0e448fcc-3ce9-4fd9-519d-00000000067d] 13531 1726882434.97925: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000067d 13531 1726882434.98010: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000067d 13531 1726882434.98013: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13531 1726882434.98074: no more pending results, returning what we have 13531 1726882434.98077: results queue empty 13531 1726882434.98078: checking for any_errors_fatal 13531 1726882434.98084: done checking for any_errors_fatal 13531 1726882434.98084: checking 
for max_fail_percentage 13531 1726882434.98086: done checking for max_fail_percentage 13531 1726882434.98087: checking to see if all hosts have failed and the running result is not ok 13531 1726882434.98088: done checking to see if all hosts have failed 13531 1726882434.98088: getting the remaining hosts for this loop 13531 1726882434.98090: done getting the remaining hosts for this loop 13531 1726882434.98092: getting the next task for host managed_node2 13531 1726882434.98100: done getting next task for host managed_node2 13531 1726882434.98102: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 13531 1726882434.98105: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882434.98108: getting variables 13531 1726882434.98110: in VariableManager get_vars() 13531 1726882434.98162: Calling all_inventory to load vars for managed_node2 13531 1726882434.98167: Calling groups_inventory to load vars for managed_node2 13531 1726882434.98169: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882434.98179: Calling all_plugins_play to load vars for managed_node2 13531 1726882434.98181: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882434.98184: Calling groups_plugins_play to load vars for managed_node2 13531 1726882434.99085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882435.00020: done with get_vars() 13531 1726882435.00035: done getting variables 13531 1726882435.00082: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13531 1726882435.00161: variable 'profile' from source: include params 13531 1726882435.00165: variable 'item' from source: include params 13531 1726882435.00205: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0.1'] ************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 21:33:55 -0400 (0:00:00.033) 0:00:22.897 ****** 13531 1726882435.00227: entering _queue_task() for managed_node2/assert 13531 1726882435.00442: worker is 1 (out of 1 available) 13531 1726882435.00460: exiting _queue_task() for managed_node2/assert 13531 1726882435.00474: done queuing things up, now waiting for results queue to drain 13531 1726882435.00475: waiting for pending results... 
13531 1726882435.00650: running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'bond0.1' 13531 1726882435.00730: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000364 13531 1726882435.00744: variable 'ansible_search_path' from source: unknown 13531 1726882435.00749: variable 'ansible_search_path' from source: unknown 13531 1726882435.00784: calling self._execute() 13531 1726882435.00862: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882435.00868: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882435.00877: variable 'omit' from source: magic vars 13531 1726882435.01346: variable 'ansible_distribution_major_version' from source: facts 13531 1726882435.01358: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882435.01365: variable 'omit' from source: magic vars 13531 1726882435.01396: variable 'omit' from source: magic vars 13531 1726882435.01468: variable 'profile' from source: include params 13531 1726882435.01471: variable 'item' from source: include params 13531 1726882435.01520: variable 'item' from source: include params 13531 1726882435.01533: variable 'omit' from source: magic vars 13531 1726882435.01574: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882435.01600: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882435.01617: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882435.01633: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882435.01643: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882435.01670: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 13531 1726882435.01673: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882435.01676: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882435.01750: Set connection var ansible_pipelining to False 13531 1726882435.01753: Set connection var ansible_timeout to 10 13531 1726882435.01761: Set connection var ansible_shell_executable to /bin/sh 13531 1726882435.01768: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882435.01770: Set connection var ansible_connection to ssh 13531 1726882435.01773: Set connection var ansible_shell_type to sh 13531 1726882435.01792: variable 'ansible_shell_executable' from source: unknown 13531 1726882435.01795: variable 'ansible_connection' from source: unknown 13531 1726882435.01799: variable 'ansible_module_compression' from source: unknown 13531 1726882435.01802: variable 'ansible_shell_type' from source: unknown 13531 1726882435.01804: variable 'ansible_shell_executable' from source: unknown 13531 1726882435.01806: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882435.01808: variable 'ansible_pipelining' from source: unknown 13531 1726882435.01811: variable 'ansible_timeout' from source: unknown 13531 1726882435.01814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882435.01915: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882435.01925: variable 'omit' from source: magic vars 13531 1726882435.01932: starting attempt loop 13531 1726882435.01935: running the handler 13531 1726882435.02014: variable 'lsr_net_profile_exists' from source: set_fact 13531 1726882435.02017: Evaluated conditional 
(lsr_net_profile_exists): True 13531 1726882435.02023: handler run complete 13531 1726882435.02035: attempt loop complete, returning result 13531 1726882435.02038: _execute() done 13531 1726882435.02041: dumping result to json 13531 1726882435.02043: done dumping result, returning 13531 1726882435.02049: done running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'bond0.1' [0e448fcc-3ce9-4fd9-519d-000000000364] 13531 1726882435.02061: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000364 13531 1726882435.02140: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000364 13531 1726882435.02143: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 13531 1726882435.02202: no more pending results, returning what we have 13531 1726882435.02206: results queue empty 13531 1726882435.02207: checking for any_errors_fatal 13531 1726882435.02214: done checking for any_errors_fatal 13531 1726882435.02214: checking for max_fail_percentage 13531 1726882435.02217: done checking for max_fail_percentage 13531 1726882435.02218: checking to see if all hosts have failed and the running result is not ok 13531 1726882435.02218: done checking to see if all hosts have failed 13531 1726882435.02219: getting the remaining hosts for this loop 13531 1726882435.02221: done getting the remaining hosts for this loop 13531 1726882435.02224: getting the next task for host managed_node2 13531 1726882435.02229: done getting next task for host managed_node2 13531 1726882435.02231: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 13531 1726882435.02234: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882435.02238: getting variables 13531 1726882435.02239: in VariableManager get_vars() 13531 1726882435.02298: Calling all_inventory to load vars for managed_node2 13531 1726882435.02301: Calling groups_inventory to load vars for managed_node2 13531 1726882435.02304: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882435.02313: Calling all_plugins_play to load vars for managed_node2 13531 1726882435.02315: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882435.02318: Calling groups_plugins_play to load vars for managed_node2 13531 1726882435.03439: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882435.05273: done with get_vars() 13531 1726882435.05298: done getting variables 13531 1726882435.05357: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13531 1726882435.05476: variable 'profile' from source: include params 13531 1726882435.05480: variable 'item' from source: include params 13531 1726882435.05538: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.1'] ********* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 21:33:55 -0400 
(0:00:00.053) 0:00:22.950 ****** 13531 1726882435.05576: entering _queue_task() for managed_node2/assert 13531 1726882435.05889: worker is 1 (out of 1 available) 13531 1726882435.05900: exiting _queue_task() for managed_node2/assert 13531 1726882435.05913: done queuing things up, now waiting for results queue to drain 13531 1726882435.05915: waiting for pending results... 13531 1726882435.06198: running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'bond0.1' 13531 1726882435.06308: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000365 13531 1726882435.06330: variable 'ansible_search_path' from source: unknown 13531 1726882435.06338: variable 'ansible_search_path' from source: unknown 13531 1726882435.06389: calling self._execute() 13531 1726882435.06494: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882435.06505: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882435.06517: variable 'omit' from source: magic vars 13531 1726882435.06876: variable 'ansible_distribution_major_version' from source: facts 13531 1726882435.06898: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882435.06910: variable 'omit' from source: magic vars 13531 1726882435.06955: variable 'omit' from source: magic vars 13531 1726882435.07061: variable 'profile' from source: include params 13531 1726882435.07075: variable 'item' from source: include params 13531 1726882435.07145: variable 'item' from source: include params 13531 1726882435.07170: variable 'omit' from source: magic vars 13531 1726882435.07218: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882435.07258: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882435.07287: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 
1726882435.07309: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882435.07326: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882435.07361: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882435.07367: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882435.07370: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882435.07463: Set connection var ansible_pipelining to False 13531 1726882435.07468: Set connection var ansible_timeout to 10 13531 1726882435.07474: Set connection var ansible_shell_executable to /bin/sh 13531 1726882435.07479: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882435.07481: Set connection var ansible_connection to ssh 13531 1726882435.07484: Set connection var ansible_shell_type to sh 13531 1726882435.07503: variable 'ansible_shell_executable' from source: unknown 13531 1726882435.07506: variable 'ansible_connection' from source: unknown 13531 1726882435.07508: variable 'ansible_module_compression' from source: unknown 13531 1726882435.07511: variable 'ansible_shell_type' from source: unknown 13531 1726882435.07513: variable 'ansible_shell_executable' from source: unknown 13531 1726882435.07515: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882435.07518: variable 'ansible_pipelining' from source: unknown 13531 1726882435.07521: variable 'ansible_timeout' from source: unknown 13531 1726882435.07524: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882435.07631: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882435.07641: variable 'omit' from source: magic vars 13531 1726882435.07646: starting attempt loop 13531 1726882435.07649: running the handler 13531 1726882435.07726: variable 'lsr_net_profile_ansible_managed' from source: set_fact 13531 1726882435.07729: Evaluated conditional (lsr_net_profile_ansible_managed): True 13531 1726882435.07735: handler run complete 13531 1726882435.07746: attempt loop complete, returning result 13531 1726882435.07749: _execute() done 13531 1726882435.07751: dumping result to json 13531 1726882435.07757: done dumping result, returning 13531 1726882435.07761: done running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'bond0.1' [0e448fcc-3ce9-4fd9-519d-000000000365] 13531 1726882435.07770: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000365 13531 1726882435.07851: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000365 13531 1726882435.07856: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 13531 1726882435.07920: no more pending results, returning what we have 13531 1726882435.07923: results queue empty 13531 1726882435.07924: checking for any_errors_fatal 13531 1726882435.07930: done checking for any_errors_fatal 13531 1726882435.07931: checking for max_fail_percentage 13531 1726882435.07933: done checking for max_fail_percentage 13531 1726882435.07934: checking to see if all hosts have failed and the running result is not ok 13531 1726882435.07935: done checking to see if all hosts have failed 13531 1726882435.07936: getting the remaining hosts for this loop 13531 1726882435.07937: done getting the remaining hosts for this loop 13531 1726882435.07941: getting the next task for host managed_node2 13531 1726882435.07946: done getting 
next task for host managed_node2 13531 1726882435.07949: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 13531 1726882435.07952: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882435.07957: getting variables 13531 1726882435.07959: in VariableManager get_vars() 13531 1726882435.08007: Calling all_inventory to load vars for managed_node2 13531 1726882435.08010: Calling groups_inventory to load vars for managed_node2 13531 1726882435.08012: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882435.08022: Calling all_plugins_play to load vars for managed_node2 13531 1726882435.08024: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882435.08027: Calling groups_plugins_play to load vars for managed_node2 13531 1726882435.08838: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882435.10268: done with get_vars() 13531 1726882435.10291: done getting variables 13531 1726882435.10349: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13531 1726882435.10460: variable 'profile' from source: include params 13531 1726882435.10465: variable 'item' from 
source: include params 13531 1726882435.10524: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0.1] *************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 21:33:55 -0400 (0:00:00.049) 0:00:23.000 ****** 13531 1726882435.10560: entering _queue_task() for managed_node2/assert 13531 1726882435.10856: worker is 1 (out of 1 available) 13531 1726882435.10869: exiting _queue_task() for managed_node2/assert 13531 1726882435.10882: done queuing things up, now waiting for results queue to drain 13531 1726882435.10883: waiting for pending results... 13531 1726882435.11479: running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in bond0.1 13531 1726882435.11568: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000366 13531 1726882435.11582: variable 'ansible_search_path' from source: unknown 13531 1726882435.11586: variable 'ansible_search_path' from source: unknown 13531 1726882435.11623: calling self._execute() 13531 1726882435.11723: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882435.11727: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882435.11737: variable 'omit' from source: magic vars 13531 1726882435.12117: variable 'ansible_distribution_major_version' from source: facts 13531 1726882435.12130: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882435.12136: variable 'omit' from source: magic vars 13531 1726882435.12184: variable 'omit' from source: magic vars 13531 1726882435.12290: variable 'profile' from source: include params 13531 1726882435.12295: variable 'item' from source: include params 13531 1726882435.12362: variable 'item' from source: include params 13531 1726882435.12381: variable 'omit' from source: magic vars 13531 1726882435.12433: 
trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882435.12468: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882435.12488: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882435.12509: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882435.12525: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882435.12557: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882435.12560: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882435.12565: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882435.12674: Set connection var ansible_pipelining to False 13531 1726882435.12680: Set connection var ansible_timeout to 10 13531 1726882435.12686: Set connection var ansible_shell_executable to /bin/sh 13531 1726882435.12691: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882435.12694: Set connection var ansible_connection to ssh 13531 1726882435.12696: Set connection var ansible_shell_type to sh 13531 1726882435.12728: variable 'ansible_shell_executable' from source: unknown 13531 1726882435.12735: variable 'ansible_connection' from source: unknown 13531 1726882435.12738: variable 'ansible_module_compression' from source: unknown 13531 1726882435.12740: variable 'ansible_shell_type' from source: unknown 13531 1726882435.12743: variable 'ansible_shell_executable' from source: unknown 13531 1726882435.12745: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882435.12749: variable 'ansible_pipelining' from source: unknown 13531 1726882435.12752: variable 'ansible_timeout' 
from source: unknown 13531 1726882435.12757: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882435.12896: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882435.12907: variable 'omit' from source: magic vars 13531 1726882435.12913: starting attempt loop 13531 1726882435.12916: running the handler 13531 1726882435.13027: variable 'lsr_net_profile_fingerprint' from source: set_fact 13531 1726882435.13031: Evaluated conditional (lsr_net_profile_fingerprint): True 13531 1726882435.13037: handler run complete 13531 1726882435.13061: attempt loop complete, returning result 13531 1726882435.13068: _execute() done 13531 1726882435.13071: dumping result to json 13531 1726882435.13073: done dumping result, returning 13531 1726882435.13080: done running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in bond0.1 [0e448fcc-3ce9-4fd9-519d-000000000366] 13531 1726882435.13087: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000366 13531 1726882435.13182: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000366 13531 1726882435.13186: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 13531 1726882435.13234: no more pending results, returning what we have 13531 1726882435.13237: results queue empty 13531 1726882435.13238: checking for any_errors_fatal 13531 1726882435.13245: done checking for any_errors_fatal 13531 1726882435.13246: checking for max_fail_percentage 13531 1726882435.13248: done checking for max_fail_percentage 13531 1726882435.13249: checking to see if all hosts have failed and the running result is not ok 13531 1726882435.13250: done checking to see if all 
hosts have failed 13531 1726882435.13251: getting the remaining hosts for this loop 13531 1726882435.13252: done getting the remaining hosts for this loop 13531 1726882435.13255: getting the next task for host managed_node2 13531 1726882435.13263: done getting next task for host managed_node2 13531 1726882435.13268: ^ task is: TASK: ** TEST check polling interval 13531 1726882435.13271: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882435.13275: getting variables 13531 1726882435.13277: in VariableManager get_vars() 13531 1726882435.13343: Calling all_inventory to load vars for managed_node2 13531 1726882435.13347: Calling groups_inventory to load vars for managed_node2 13531 1726882435.13350: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882435.13362: Calling all_plugins_play to load vars for managed_node2 13531 1726882435.13367: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882435.13371: Calling groups_plugins_play to load vars for managed_node2 13531 1726882435.15333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882435.17380: done with get_vars() 13531 1726882435.17467: done getting variables 13531 1726882435.17529: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check polling interval] ****************************************** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:75 Friday 20 September 2024 21:33:55 -0400 (0:00:00.069) 0:00:23.070 ****** 13531 1726882435.17558: entering _queue_task() for managed_node2/command 13531 1726882435.17914: worker is 1 (out of 1 available) 13531 1726882435.17927: exiting _queue_task() for managed_node2/command 13531 1726882435.17940: done queuing things up, now waiting for results queue to drain 13531 1726882435.17941: waiting for pending results... 13531 1726882435.18894: running TaskExecutor() for managed_node2/TASK: ** TEST check polling interval 13531 1726882435.19001: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000071 13531 1726882435.19029: variable 'ansible_search_path' from source: unknown 13531 1726882435.19077: calling self._execute() 13531 1726882435.19178: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882435.19189: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882435.19201: variable 'omit' from source: magic vars 13531 1726882435.19869: variable 'ansible_distribution_major_version' from source: facts 13531 1726882435.19874: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882435.19876: variable 'omit' from source: magic vars 13531 1726882435.19879: variable 'omit' from source: magic vars 13531 1726882435.19881: variable 'controller_device' from source: play vars 13531 1726882435.19884: variable 'omit' from source: magic vars 13531 1726882435.19886: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882435.19889: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882435.19891: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882435.19893: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882435.19895: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882435.19930: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882435.19933: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882435.19936: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882435.20050: Set connection var ansible_pipelining to False 13531 1726882435.20058: Set connection var ansible_timeout to 10 13531 1726882435.20061: Set connection var ansible_shell_executable to /bin/sh 13531 1726882435.20067: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882435.20072: Set connection var ansible_connection to ssh 13531 1726882435.20074: Set connection var ansible_shell_type to sh 13531 1726882435.20101: variable 'ansible_shell_executable' from source: unknown 13531 1726882435.20104: variable 'ansible_connection' from source: unknown 13531 1726882435.20107: variable 'ansible_module_compression' from source: unknown 13531 1726882435.20109: variable 'ansible_shell_type' from source: unknown 13531 1726882435.20112: variable 'ansible_shell_executable' from source: unknown 13531 1726882435.20114: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882435.20116: variable 'ansible_pipelining' from source: unknown 13531 1726882435.20120: variable 'ansible_timeout' from source: unknown 13531 1726882435.20122: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882435.20273: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882435.20283: variable 'omit' from source: magic vars 13531 1726882435.20289: starting attempt loop 13531 1726882435.20293: running the handler 13531 1726882435.20308: _low_level_execute_command(): starting 13531 1726882435.20314: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13531 1726882435.21230: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882435.21238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882435.21281: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 13531 1726882435.21284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882435.21300: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 13531 1726882435.21305: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882435.21317: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 13531 1726882435.21323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882435.21416: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882435.21435: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882435.21618: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882435.23241: stdout chunk (state=3): >>>/root <<< 13531 1726882435.23411: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882435.23414: stderr chunk (state=3): >>><<< 13531 1726882435.23419: stdout chunk (state=3): >>><<< 13531 1726882435.23448: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882435.23469: _low_level_execute_command(): starting 13531 1726882435.23474: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882435.2344728-14566-276265807573715 `" && echo 
ansible-tmp-1726882435.2344728-14566-276265807573715="` echo /root/.ansible/tmp/ansible-tmp-1726882435.2344728-14566-276265807573715 `" ) && sleep 0' 13531 1726882435.24120: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882435.24128: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882435.24138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882435.24150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882435.24190: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882435.24198: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882435.24210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882435.24222: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882435.24229: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882435.24236: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882435.24243: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882435.24251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882435.24267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882435.24275: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882435.24287: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882435.24308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882435.24432: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master <<< 13531 1726882435.24468: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882435.24472: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882435.24792: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882435.26478: stdout chunk (state=3): >>>ansible-tmp-1726882435.2344728-14566-276265807573715=/root/.ansible/tmp/ansible-tmp-1726882435.2344728-14566-276265807573715 <<< 13531 1726882435.26666: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882435.26670: stdout chunk (state=3): >>><<< 13531 1726882435.26676: stderr chunk (state=3): >>><<< 13531 1726882435.26693: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882435.2344728-14566-276265807573715=/root/.ansible/tmp/ansible-tmp-1726882435.2344728-14566-276265807573715 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882435.26724: variable 'ansible_module_compression' from source: unknown 13531 1726882435.26849: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13531i_snf1g8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13531 1726882435.26882: variable 'ansible_facts' from source: unknown 13531 1726882435.27074: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882435.2344728-14566-276265807573715/AnsiballZ_command.py 13531 1726882435.27183: Sending initial data 13531 1726882435.27186: Sent initial data (156 bytes) 13531 1726882435.28017: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882435.28026: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882435.28038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882435.28067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882435.28122: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882435.28129: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882435.28139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882435.28155: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882435.28158: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882435.28196: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882435.28203: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882435.28212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882435.28223: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882435.28230: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882435.28237: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882435.28246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882435.28347: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882435.28363: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882435.28385: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882435.29275: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882435.30324: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13531 1726882435.30420: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13531 1726882435.30521: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13531i_snf1g8/tmpzv0ve6w8 /root/.ansible/tmp/ansible-tmp-1726882435.2344728-14566-276265807573715/AnsiballZ_command.py <<< 13531 1726882435.30625: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13531 
1726882435.31994: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882435.32197: stderr chunk (state=3): >>><<< 13531 1726882435.32200: stdout chunk (state=3): >>><<< 13531 1726882435.32219: done transferring module to remote 13531 1726882435.32231: _low_level_execute_command(): starting 13531 1726882435.32236: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882435.2344728-14566-276265807573715/ /root/.ansible/tmp/ansible-tmp-1726882435.2344728-14566-276265807573715/AnsiballZ_command.py && sleep 0' 13531 1726882435.33196: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882435.33204: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882435.33221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882435.33258: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882435.33262: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 13531 1726882435.33278: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882435.33284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 13531 1726882435.33296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882435.33372: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master <<< 13531 1726882435.33385: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882435.33391: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882435.33518: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882435.35347: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882435.35350: stderr chunk (state=3): >>><<< 13531 1726882435.35360: stdout chunk (state=3): >>><<< 13531 1726882435.35376: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882435.35379: _low_level_execute_command(): starting 13531 1726882435.35384: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1726882435.2344728-14566-276265807573715/AnsiballZ_command.py && sleep 0' 13531 1726882435.35975: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882435.35984: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882435.35994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882435.36009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882435.36046: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882435.36056: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882435.36063: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882435.36077: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882435.36084: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882435.36090: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882435.36098: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882435.36106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882435.36118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882435.36127: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882435.36131: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882435.36142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882435.36213: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 
1726882435.36226: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882435.36231: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882435.36410: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882435.50012: stdout chunk (state=3): >>> {"changed": true, "stdout": "MII Polling Interval (ms): 110", "stderr": "", "rc": 0, "cmd": ["grep", "Polling Interval", "/proc/net/bonding/nm-bond"], "start": "2024-09-20 21:33:55.494526", "end": "2024-09-20 21:33:55.498091", "delta": "0:00:00.003565", "msg": "", "invocation": {"module_args": {"_raw_params": "grep 'Polling Interval' /proc/net/bonding/nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13531 1726882435.51250: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 13531 1726882435.51257: stderr chunk (state=3): >>><<< 13531 1726882435.51260: stdout chunk (state=3): >>><<< 13531 1726882435.51279: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "MII Polling Interval (ms): 110", "stderr": "", "rc": 0, "cmd": ["grep", "Polling Interval", "/proc/net/bonding/nm-bond"], "start": "2024-09-20 21:33:55.494526", "end": "2024-09-20 21:33:55.498091", "delta": "0:00:00.003565", "msg": "", "invocation": {"module_args": {"_raw_params": "grep 'Polling Interval' /proc/net/bonding/nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
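The module result in the record above is a single JSON object that `AnsiballZ_command.py` emits on stdout; the controller parses it and, as the log shows shortly after, evaluates the conditional `('110' in result.stdout)`. A minimal sketch of that parse, with the field names copied from the log and the JSON abbreviated:

```python
import json

# Abbreviated copy of the module result printed on stdout in the log above.
raw = """{"changed": true,
          "stdout": "MII Polling Interval (ms): 110",
          "stderr": "", "rc": 0,
          "cmd": ["grep", "Polling Interval", "/proc/net/bonding/nm-bond"]}"""

result = json.loads(raw)

# Mirrors the controller-side evaluation seen in the log:
# rc=0 and "Evaluated conditional ('110' in result.stdout): True".
assert result["rc"] == 0
assert "110" in result["stdout"]
```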
13531 1726882435.51316: done with _execute_module (ansible.legacy.command, {'_raw_params': "grep 'Polling Interval' /proc/net/bonding/nm-bond", '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882435.2344728-14566-276265807573715/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13531 1726882435.51320: _low_level_execute_command(): starting 13531 1726882435.51326: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882435.2344728-14566-276265807573715/ > /dev/null 2>&1 && sleep 0' 13531 1726882435.52117: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882435.52121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882435.52175: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882435.52180: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882435.52194: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 13531 1726882435.52200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882435.52283: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882435.52297: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882435.52303: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882435.52430: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882435.54348: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882435.54442: stderr chunk (state=3): >>><<< 13531 1726882435.54445: stdout chunk (state=3): >>><<< 13531 1726882435.54460: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882435.54466: handler run complete 13531 1726882435.54493: Evaluated conditional (False): False 13531 1726882435.54644: variable 'result' from source: unknown 13531 1726882435.54662: Evaluated conditional ('110' in result.stdout): True 13531 1726882435.54684: attempt loop complete, returning result 13531 1726882435.54687: _execute() done 13531 1726882435.54689: dumping result to json 13531 1726882435.54692: done dumping result, returning 13531 1726882435.54700: done running TaskExecutor() for managed_node2/TASK: ** TEST check polling interval [0e448fcc-3ce9-4fd9-519d-000000000071] 13531 1726882435.54708: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000071 13531 1726882435.54818: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000071 13531 1726882435.54821: WORKER PROCESS EXITING ok: [managed_node2] => { "attempts": 1, "changed": false, "cmd": [ "grep", "Polling Interval", "/proc/net/bonding/nm-bond" ], "delta": "0:00:00.003565", "end": "2024-09-20 21:33:55.498091", "rc": 0, "start": "2024-09-20 21:33:55.494526" } STDOUT: MII Polling Interval (ms): 110 13531 1726882435.54898: no more pending results, returning what we have 13531 1726882435.54902: results queue empty 13531 1726882435.54903: checking for any_errors_fatal 13531 1726882435.54909: done checking for any_errors_fatal 13531 1726882435.54909: checking for max_fail_percentage 13531 1726882435.54911: done checking for max_fail_percentage 13531 1726882435.54912: checking to see if all hosts have failed and the running result is not ok 13531 1726882435.54913: done checking to see if all hosts have failed 13531 1726882435.54913: getting the remaining hosts for this loop 13531 1726882435.54915: done getting the remaining hosts for this loop 13531 1726882435.54918: getting the next task for host managed_node2 13531 
1726882435.54923: done getting next task for host managed_node2 13531 1726882435.54925: ^ task is: TASK: ** TEST check IPv4 13531 1726882435.54927: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882435.54932: getting variables 13531 1726882435.54933: in VariableManager get_vars() 13531 1726882435.54983: Calling all_inventory to load vars for managed_node2 13531 1726882435.54986: Calling groups_inventory to load vars for managed_node2 13531 1726882435.54988: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882435.54997: Calling all_plugins_play to load vars for managed_node2 13531 1726882435.55000: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882435.55002: Calling groups_plugins_play to load vars for managed_node2 13531 1726882435.57924: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882435.60029: done with get_vars() 13531 1726882435.60055: done getting variables 13531 1726882435.60117: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check IPv4] ****************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:80 Friday 20 September 2024 21:33:55 -0400 (0:00:00.425) 0:00:23.496 ****** 13531 1726882435.60145: entering _queue_task() for managed_node2/command 13531 1726882435.60474: worker is 1 (out of 1 available) 
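The `** TEST check polling interval` task that completes above greps `/proc/net/bonding/nm-bond`, the text file where the Linux bonding driver reports the bond's status, including the configured miimon value on the `MII Polling Interval (ms)` line. A minimal local sketch of the same extraction; the sample file content below is an assumption, modeled on the stdout captured in the log:

```python
# Hypothetical sample of /proc/net/bonding/nm-bond contents; only the
# "MII Polling Interval (ms): 110" line is taken from the log above.
sample = """\
Bonding Mode: load balancing (round-robin)
MII Status: up
MII Polling Interval (ms): 110
"""

interval = None
for line in sample.splitlines():
    if line.startswith("MII Polling Interval"):
        # Value follows the colon, e.g. "MII Polling Interval (ms): 110".
        interval = int(line.split(":")[1])
        break

assert interval == 110
```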
13531 1726882435.60485: exiting _queue_task() for managed_node2/command 13531 1726882435.60498: done queuing things up, now waiting for results queue to drain 13531 1726882435.60499: waiting for pending results... 13531 1726882435.60792: running TaskExecutor() for managed_node2/TASK: ** TEST check IPv4 13531 1726882435.60892: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000072 13531 1726882435.60907: variable 'ansible_search_path' from source: unknown 13531 1726882435.60948: calling self._execute() 13531 1726882435.61081: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882435.61088: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882435.61098: variable 'omit' from source: magic vars 13531 1726882435.61530: variable 'ansible_distribution_major_version' from source: facts 13531 1726882435.61542: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882435.61548: variable 'omit' from source: magic vars 13531 1726882435.61575: variable 'omit' from source: magic vars 13531 1726882435.61679: variable 'controller_device' from source: play vars 13531 1726882435.61698: variable 'omit' from source: magic vars 13531 1726882435.61749: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882435.61787: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882435.61807: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882435.61832: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882435.61867: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882435.61876: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 
1726882435.61881: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882435.61883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882435.61996: Set connection var ansible_pipelining to False 13531 1726882435.62008: Set connection var ansible_timeout to 10 13531 1726882435.62011: Set connection var ansible_shell_executable to /bin/sh 13531 1726882435.62013: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882435.62015: Set connection var ansible_connection to ssh 13531 1726882435.62018: Set connection var ansible_shell_type to sh 13531 1726882435.62049: variable 'ansible_shell_executable' from source: unknown 13531 1726882435.62052: variable 'ansible_connection' from source: unknown 13531 1726882435.62058: variable 'ansible_module_compression' from source: unknown 13531 1726882435.62061: variable 'ansible_shell_type' from source: unknown 13531 1726882435.62065: variable 'ansible_shell_executable' from source: unknown 13531 1726882435.62067: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882435.62069: variable 'ansible_pipelining' from source: unknown 13531 1726882435.62071: variable 'ansible_timeout' from source: unknown 13531 1726882435.62074: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882435.62221: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882435.62233: variable 'omit' from source: magic vars 13531 1726882435.62239: starting attempt loop 13531 1726882435.62241: running the handler 13531 1726882435.62261: _low_level_execute_command(): starting 13531 1726882435.62271: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 
0' 13531 1726882435.63043: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882435.63058: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882435.63069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882435.63085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882435.63128: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882435.63135: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882435.63145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882435.63158: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882435.63167: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882435.63175: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882435.63183: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882435.63192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882435.63204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882435.63212: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882435.63219: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882435.63230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882435.63312: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882435.63345: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 
1726882435.63358: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882435.63846: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882435.65146: stdout chunk (state=3): >>>/root <<< 13531 1726882435.65288: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882435.65345: stderr chunk (state=3): >>><<< 13531 1726882435.65348: stdout chunk (state=3): >>><<< 13531 1726882435.65365: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882435.65379: _low_level_execute_command(): starting 13531 1726882435.65386: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882435.6536634-14588-181612992633222 `" && echo 
ansible-tmp-1726882435.6536634-14588-181612992633222="` echo /root/.ansible/tmp/ansible-tmp-1726882435.6536634-14588-181612992633222 `" ) && sleep 0' 13531 1726882435.65983: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882435.65990: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882435.66000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882435.66014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882435.66057: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882435.66060: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882435.66063: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882435.66080: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882435.66090: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882435.66093: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882435.66102: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882435.66112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882435.66123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882435.66130: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882435.66136: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882435.66145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882435.66222: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master <<< 13531 1726882435.66229: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882435.66232: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882435.66373: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882435.68293: stdout chunk (state=3): >>>ansible-tmp-1726882435.6536634-14588-181612992633222=/root/.ansible/tmp/ansible-tmp-1726882435.6536634-14588-181612992633222 <<< 13531 1726882435.68349: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882435.68433: stderr chunk (state=3): >>><<< 13531 1726882435.68444: stdout chunk (state=3): >>><<< 13531 1726882435.68759: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882435.6536634-14588-181612992633222=/root/.ansible/tmp/ansible-tmp-1726882435.6536634-14588-181612992633222 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882435.68768: variable 'ansible_module_compression' from source: unknown 13531 1726882435.68771: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13531i_snf1g8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13531 1726882435.68773: variable 'ansible_facts' from source: unknown 13531 1726882435.68775: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882435.6536634-14588-181612992633222/AnsiballZ_command.py 13531 1726882435.68834: Sending initial data 13531 1726882435.68837: Sent initial data (156 bytes) 13531 1726882435.70441: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882435.70462: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882435.70490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882435.70510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882435.70550: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882435.70572: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882435.70595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882435.70613: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882435.70624: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882435.70634: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882435.70645: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882435.70657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882435.70680: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882435.70692: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882435.70702: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882435.70715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882435.70794: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882435.70815: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882435.70830: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882435.70957: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882435.72734: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13531 1726882435.72824: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13531 1726882435.72926: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13531i_snf1g8/tmpelepi9cx /root/.ansible/tmp/ansible-tmp-1726882435.6536634-14588-181612992633222/AnsiballZ_command.py <<< 13531 1726882435.73021: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13531 
1726882435.74378: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882435.74635: stderr chunk (state=3): >>><<< 13531 1726882435.74639: stdout chunk (state=3): >>><<< 13531 1726882435.74641: done transferring module to remote 13531 1726882435.74647: _low_level_execute_command(): starting 13531 1726882435.74650: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882435.6536634-14588-181612992633222/ /root/.ansible/tmp/ansible-tmp-1726882435.6536634-14588-181612992633222/AnsiballZ_command.py && sleep 0' 13531 1726882435.75215: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882435.75231: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882435.75245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882435.75265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882435.75307: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882435.75321: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882435.75336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882435.75354: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882435.75368: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882435.75380: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882435.75393: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882435.75407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882435.75422: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882435.75436: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882435.75449: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882435.75466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882435.75539: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882435.75562: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882435.75582: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882435.75712: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882435.77505: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882435.77596: stderr chunk (state=3): >>><<< 13531 1726882435.77617: stdout chunk (state=3): >>><<< 13531 1726882435.77669: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882435.77672: _low_level_execute_command(): starting 13531 1726882435.77675: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882435.6536634-14588-181612992633222/AnsiballZ_command.py && sleep 0' 13531 1726882435.78489: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882435.78515: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882435.78535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882435.78554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882435.78600: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882435.78613: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882435.78637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882435.78656: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882435.78674: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882435.78686: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882435.78699: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882435.78714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882435.78737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882435.78754: stderr chunk (state=3): >>>debug2: checking match 
for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882435.78768: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882435.78784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882435.78883: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882435.78915: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882435.78934: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882435.79089: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882435.92556: stdout chunk (state=3): >>> {"changed": true, "stdout": "19: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.223/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 234sec preferred_lft 234sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-20 21:33:55.920063", "end": "2024-09-20 21:33:55.923633", "delta": "0:00:00.003570", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13531 1726882435.93821: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 13531 1726882435.93825: stdout chunk (state=3): >>><<< 13531 1726882435.93831: stderr chunk (state=3): >>><<< 13531 1726882435.93850: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "19: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.223/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 234sec preferred_lft 234sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-20 21:33:55.920063", "end": "2024-09-20 21:33:55.923633", "delta": "0:00:00.003570", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 Shared connection to 10.31.11.158 closed. 13531 1726882435.93899: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -4 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882435.6536634-14588-181612992633222/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13531 1726882435.93905: _low_level_execute_command(): starting 13531 1726882435.93910: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882435.6536634-14588-181612992633222/ > /dev/null 2>&1 && sleep 0' 13531 1726882435.95229: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882435.95233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882435.95324: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882435.95328: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882435.95343: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 13531 1726882435.95347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882435.95439: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882435.95445: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882435.95589: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882435.95802: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882435.97631: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882435.97659: stderr chunk (state=3): >>><<< 13531 1726882435.97663: stdout chunk (state=3): >>><<< 13531 1726882435.97769: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882435.97773: handler run complete 13531 1726882435.97775: Evaluated conditional (False): False 13531 1726882435.97969: variable 'result' from source: set_fact 13531 1726882435.97972: Evaluated conditional ('192.0.2' in result.stdout): True 13531 1726882435.97974: attempt loop complete, returning result 13531 1726882435.97976: _execute() done 13531 1726882435.97978: dumping result to json 13531 1726882435.97979: done dumping result, returning 13531 1726882435.97981: done running TaskExecutor() for managed_node2/TASK: ** TEST check IPv4 [0e448fcc-3ce9-4fd9-519d-000000000072] 13531 1726882435.97983: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000072 ok: [managed_node2] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-4", "a", "s", "nm-bond" ], "delta": "0:00:00.003570", "end": "2024-09-20 21:33:55.923633", "rc": 0, "start": "2024-09-20 21:33:55.920063" } STDOUT: 19: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet 192.0.2.223/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond valid_lft 234sec preferred_lft 234sec 13531 1726882435.98166: no more pending results, returning what we have 13531 1726882435.98169: results queue empty 13531 1726882435.98170: checking for any_errors_fatal 13531 1726882435.98180: done checking for any_errors_fatal 13531 1726882435.98181: checking for max_fail_percentage 13531 1726882435.98183: done checking for max_fail_percentage 13531 1726882435.98184: checking to see if all hosts have failed and the running result is not ok 13531 1726882435.98185: done checking to see if all hosts have failed 13531 1726882435.98186: getting the remaining hosts for this loop 13531 1726882435.98187: done getting the remaining hosts for this loop 13531 1726882435.98191: getting the next task for host managed_node2 
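The "** TEST check IPv4" task above succeeds because the logged conditional, `Evaluated conditional ('192.0.2' in result.stdout): True`, is a plain substring test against the `ip -4 a s nm-bond` output returned by the command module. A minimal sketch reproducing that check outside Ansible, using the module result captured verbatim in this trace (abridged to the fields the conditional reads; the exact assertion semantics beyond the logged expression are an assumption):

```python
import json

# Module result as captured in the trace above, trimmed to the fields
# the playbook's conditional actually inspects.
module_result = json.loads('''
{"changed": true,
 "stdout": "19: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\\n    inet 192.0.2.223/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\\n       valid_lft 234sec preferred_lft 234sec",
 "rc": 0,
 "cmd": ["ip", "-4", "a", "s", "nm-bond"]}
''')

# The conditional as logged: '192.0.2' in result.stdout
# i.e. the bond interface received a DHCP lease from 192.0.2.0/24.
assert '192.0.2' in module_result['stdout']
# The command itself also exited cleanly.
assert module_result['rc'] == 0
print("IPv4 check passed")
```

With attempt 1 satisfying the substring test, the attempt loop completes immediately, which matches `"attempts": 1` in the task result shown above.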
13531 1726882435.98198: done getting next task for host managed_node2 13531 1726882435.98201: ^ task is: TASK: ** TEST check IPv6 13531 1726882435.98204: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882435.98208: getting variables 13531 1726882435.98210: in VariableManager get_vars() 13531 1726882435.98271: Calling all_inventory to load vars for managed_node2 13531 1726882435.98275: Calling groups_inventory to load vars for managed_node2 13531 1726882435.98277: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882435.98290: Calling all_plugins_play to load vars for managed_node2 13531 1726882435.98293: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882435.98296: Calling groups_plugins_play to load vars for managed_node2 13531 1726882435.99371: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000072 13531 1726882435.99375: WORKER PROCESS EXITING 13531 1726882436.00735: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882436.02408: done with get_vars() 13531 1726882436.02438: done getting variables 13531 1726882436.02502: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check IPv6] ****************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:87 Friday 20 September 2024 21:33:56 -0400 (0:00:00.423) 
0:00:23.920 ****** 13531 1726882436.02532: entering _queue_task() for managed_node2/command 13531 1726882436.02970: worker is 1 (out of 1 available) 13531 1726882436.02983: exiting _queue_task() for managed_node2/command 13531 1726882436.02996: done queuing things up, now waiting for results queue to drain 13531 1726882436.02998: waiting for pending results... 13531 1726882436.03287: running TaskExecutor() for managed_node2/TASK: ** TEST check IPv6 13531 1726882436.03394: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000073 13531 1726882436.03415: variable 'ansible_search_path' from source: unknown 13531 1726882436.03469: calling self._execute() 13531 1726882436.03681: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882436.03693: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882436.03708: variable 'omit' from source: magic vars 13531 1726882436.04445: variable 'ansible_distribution_major_version' from source: facts 13531 1726882436.04466: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882436.04477: variable 'omit' from source: magic vars 13531 1726882436.04499: variable 'omit' from source: magic vars 13531 1726882436.04595: variable 'controller_device' from source: play vars 13531 1726882436.04618: variable 'omit' from source: magic vars 13531 1726882436.04673: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882436.04707: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882436.04734: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882436.04759: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882436.04779: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882436.04810: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882436.04818: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882436.04824: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882436.04935: Set connection var ansible_pipelining to False 13531 1726882436.04947: Set connection var ansible_timeout to 10 13531 1726882436.04961: Set connection var ansible_shell_executable to /bin/sh 13531 1726882436.04977: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882436.04984: Set connection var ansible_connection to ssh 13531 1726882436.04991: Set connection var ansible_shell_type to sh 13531 1726882436.05023: variable 'ansible_shell_executable' from source: unknown 13531 1726882436.05033: variable 'ansible_connection' from source: unknown 13531 1726882436.05042: variable 'ansible_module_compression' from source: unknown 13531 1726882436.05050: variable 'ansible_shell_type' from source: unknown 13531 1726882436.05058: variable 'ansible_shell_executable' from source: unknown 13531 1726882436.05068: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882436.05080: variable 'ansible_pipelining' from source: unknown 13531 1726882436.05087: variable 'ansible_timeout' from source: unknown 13531 1726882436.05093: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882436.05243: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882436.05263: variable 'omit' from source: magic vars 13531 1726882436.05275: starting 
attempt loop 13531 1726882436.05281: running the handler 13531 1726882436.05305: _low_level_execute_command(): starting 13531 1726882436.05317: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13531 1726882436.06811: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882436.06818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882436.06847: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 13531 1726882436.06852: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882436.06855: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882436.06927: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882436.06933: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882436.06936: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882436.07046: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882436.08711: stdout chunk (state=3): >>>/root <<< 13531 1726882436.08878: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882436.08906: stderr 
chunk (state=3): >>><<< 13531 1726882436.08913: stdout chunk (state=3): >>><<< 13531 1726882436.09036: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882436.09040: _low_level_execute_command(): starting 13531 1726882436.09043: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882436.089395-14615-104275410391328 `" && echo ansible-tmp-1726882436.089395-14615-104275410391328="` echo /root/.ansible/tmp/ansible-tmp-1726882436.089395-14615-104275410391328 `" ) && sleep 0' 13531 1726882436.10237: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882436.10241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882436.10284: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 13531 1726882436.10288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882436.10297: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882436.10299: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882436.10370: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882436.10374: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882436.10376: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882436.10492: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882436.12402: stdout chunk (state=3): >>>ansible-tmp-1726882436.089395-14615-104275410391328=/root/.ansible/tmp/ansible-tmp-1726882436.089395-14615-104275410391328 <<< 13531 1726882436.12507: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882436.12591: stderr chunk (state=3): >>><<< 13531 1726882436.12594: stdout chunk (state=3): >>><<< 13531 1726882436.12770: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882436.089395-14615-104275410391328=/root/.ansible/tmp/ansible-tmp-1726882436.089395-14615-104275410391328 , stderr=OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882436.12773: variable 'ansible_module_compression' from source: unknown 13531 1726882436.12776: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13531i_snf1g8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13531 1726882436.12778: variable 'ansible_facts' from source: unknown 13531 1726882436.12836: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882436.089395-14615-104275410391328/AnsiballZ_command.py 13531 1726882436.12995: Sending initial data 13531 1726882436.12998: Sent initial data (155 bytes) 13531 1726882436.14672: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882436.14981: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882436.15025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 
1726882436.15044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882436.15119: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882436.15146: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882436.15161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882436.15182: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882436.15194: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882436.15207: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882436.15221: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882436.15235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882436.15252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882436.15267: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882436.15280: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882436.15294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882436.15381: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882436.15404: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882436.15421: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882436.15560: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882436.17340: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension 
"posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13531 1726882436.17432: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13531 1726882436.17533: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13531i_snf1g8/tmp7xhamzca /root/.ansible/tmp/ansible-tmp-1726882436.089395-14615-104275410391328/AnsiballZ_command.py <<< 13531 1726882436.17627: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13531 1726882436.19082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882436.19271: stderr chunk (state=3): >>><<< 13531 1726882436.19275: stdout chunk (state=3): >>><<< 13531 1726882436.19277: done transferring module to remote 13531 1726882436.19279: _low_level_execute_command(): starting 13531 1726882436.19281: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882436.089395-14615-104275410391328/ /root/.ansible/tmp/ansible-tmp-1726882436.089395-14615-104275410391328/AnsiballZ_command.py && sleep 0' 13531 1726882436.19912: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882436.19928: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882436.19948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882436.19972: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882436.20017: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882436.20029: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882436.20044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882436.20070: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882436.20083: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882436.20099: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882436.20111: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882436.20124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882436.20139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882436.20151: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882436.20167: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882436.20184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882436.20265: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882436.20302: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882436.20339: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882436.20438: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882436.22223: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882436.22311: stderr chunk (state=3): >>><<< 13531 1726882436.22326: stdout 
chunk (state=3): >>><<< 13531 1726882436.22436: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882436.22439: _low_level_execute_command(): starting 13531 1726882436.22442: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882436.089395-14615-104275410391328/AnsiballZ_command.py && sleep 0' 13531 1726882436.23066: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882436.23081: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882436.23102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882436.23121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882436.23169: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882436.23182: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882436.23197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882436.23219: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882436.23232: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882436.23244: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882436.23260: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882436.23276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882436.23292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882436.23304: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882436.23320: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882436.23338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882436.23417: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882436.23458: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882436.23496: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882436.23602: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882436.37049: stdout chunk (state=3): >>> {"changed": true, "stdout": "19: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::1ab/128 scope global dynamic noprefixroute \n valid_lft 235sec preferred_lft 235sec\n inet6 2001:db8::61bd:7fbe:dc15:723e/64 
scope global dynamic noprefixroute \n valid_lft 1795sec preferred_lft 1795sec\n inet6 fe80::5cb0:c9a4:6967:876b/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-20 21:33:56.365067", "end": "2024-09-20 21:33:56.368564", "delta": "0:00:00.003497", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13531 1726882436.38333: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 13531 1726882436.38337: stdout chunk (state=3): >>><<< 13531 1726882436.38339: stderr chunk (state=3): >>><<< 13531 1726882436.38489: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "19: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::1ab/128 scope global dynamic noprefixroute \n valid_lft 235sec preferred_lft 235sec\n inet6 2001:db8::61bd:7fbe:dc15:723e/64 scope global dynamic noprefixroute \n valid_lft 1795sec preferred_lft 1795sec\n inet6 fe80::5cb0:c9a4:6967:876b/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-20 21:33:56.365067", "end": "2024-09-20 21:33:56.368564", "delta": "0:00:00.003497", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
13531 1726882436.38498: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882436.089395-14615-104275410391328/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13531 1726882436.38501: _low_level_execute_command(): starting 13531 1726882436.38504: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882436.089395-14615-104275410391328/ > /dev/null 2>&1 && sleep 0' 13531 1726882436.39690: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882436.39695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882436.39718: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 13531 1726882436.39721: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882436.39795: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882436.39798: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882436.39801: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882436.39911: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882436.41877: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882436.41881: stdout chunk (state=3): >>><<< 13531 1726882436.41884: stderr chunk (state=3): >>><<< 13531 1726882436.42172: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status 
from master 0 13531 1726882436.42176: handler run complete 13531 1726882436.42178: Evaluated conditional (False): False 13531 1726882436.42180: variable 'result' from source: set_fact 13531 1726882436.42182: Evaluated conditional ('2001' in result.stdout): True 13531 1726882436.42184: attempt loop complete, returning result 13531 1726882436.42186: _execute() done 13531 1726882436.42187: dumping result to json 13531 1726882436.42189: done dumping result, returning 13531 1726882436.42191: done running TaskExecutor() for managed_node2/TASK: ** TEST check IPv6 [0e448fcc-3ce9-4fd9-519d-000000000073] 13531 1726882436.42193: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000073 ok: [managed_node2] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-6", "a", "s", "nm-bond" ], "delta": "0:00:00.003497", "end": "2024-09-20 21:33:56.368564", "rc": 0, "start": "2024-09-20 21:33:56.365067" } STDOUT: 19: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet6 2001:db8::1ab/128 scope global dynamic noprefixroute valid_lft 235sec preferred_lft 235sec inet6 2001:db8::61bd:7fbe:dc15:723e/64 scope global dynamic noprefixroute valid_lft 1795sec preferred_lft 1795sec inet6 fe80::5cb0:c9a4:6967:876b/64 scope link noprefixroute valid_lft forever preferred_lft forever 13531 1726882436.42374: no more pending results, returning what we have 13531 1726882436.42378: results queue empty 13531 1726882436.42379: checking for any_errors_fatal 13531 1726882436.42390: done checking for any_errors_fatal 13531 1726882436.42390: checking for max_fail_percentage 13531 1726882436.42392: done checking for max_fail_percentage 13531 1726882436.42393: checking to see if all hosts have failed and the running result is not ok 13531 1726882436.42394: done checking to see if all hosts have failed 13531 1726882436.42395: getting the remaining hosts for this loop 13531 1726882436.42397: done getting the remaining hosts for this loop 13531 1726882436.42400: getting the next task for 
host managed_node2 13531 1726882436.42407: done getting next task for host managed_node2 13531 1726882436.42413: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13531 1726882436.42416: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882436.42435: getting variables 13531 1726882436.42437: in VariableManager get_vars() 13531 1726882436.42495: Calling all_inventory to load vars for managed_node2 13531 1726882436.42498: Calling groups_inventory to load vars for managed_node2 13531 1726882436.42501: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882436.42513: Calling all_plugins_play to load vars for managed_node2 13531 1726882436.42515: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882436.42518: Calling groups_plugins_play to load vars for managed_node2 13531 1726882436.43302: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000073 13531 1726882436.43310: WORKER PROCESS EXITING 13531 1726882436.44401: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882436.46149: done with get_vars() 13531 1726882436.46183: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:33:56 -0400 
(0:00:00.437) 0:00:24.358 ****** 13531 1726882436.46300: entering _queue_task() for managed_node2/include_tasks 13531 1726882436.46652: worker is 1 (out of 1 available) 13531 1726882436.46668: exiting _queue_task() for managed_node2/include_tasks 13531 1726882436.46681: done queuing things up, now waiting for results queue to drain 13531 1726882436.46682: waiting for pending results... 13531 1726882436.46984: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13531 1726882436.47135: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000007b 13531 1726882436.47158: variable 'ansible_search_path' from source: unknown 13531 1726882436.47169: variable 'ansible_search_path' from source: unknown 13531 1726882436.47212: calling self._execute() 13531 1726882436.47310: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882436.47319: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882436.47332: variable 'omit' from source: magic vars 13531 1726882436.48014: variable 'ansible_distribution_major_version' from source: facts 13531 1726882436.48032: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882436.48108: _execute() done 13531 1726882436.48117: dumping result to json 13531 1726882436.48124: done dumping result, returning 13531 1726882436.48136: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-4fd9-519d-00000000007b] 13531 1726882436.48147: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000007b 13531 1726882436.48302: no more pending results, returning what we have 13531 1726882436.48307: in VariableManager get_vars() 13531 1726882436.48377: Calling all_inventory to load vars for managed_node2 13531 1726882436.48380: Calling groups_inventory to load vars for managed_node2 13531 1726882436.48383: Calling 
all_plugins_inventory to load vars for managed_node2 13531 1726882436.48397: Calling all_plugins_play to load vars for managed_node2 13531 1726882436.48400: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882436.48403: Calling groups_plugins_play to load vars for managed_node2 13531 1726882436.50191: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000007b 13531 1726882436.50196: WORKER PROCESS EXITING 13531 1726882436.50559: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882436.52477: done with get_vars() 13531 1726882436.52498: variable 'ansible_search_path' from source: unknown 13531 1726882436.52499: variable 'ansible_search_path' from source: unknown 13531 1726882436.52541: we have included files to process 13531 1726882436.52542: generating all_blocks data 13531 1726882436.52545: done generating all_blocks data 13531 1726882436.52550: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13531 1726882436.52551: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13531 1726882436.52553: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13531 1726882436.53795: done processing included file 13531 1726882436.53797: iterating over new_blocks loaded from include file 13531 1726882436.53799: in VariableManager get_vars() 13531 1726882436.53835: done with get_vars() 13531 1726882436.53837: filtering new block on tags 13531 1726882436.53855: done filtering new block on tags 13531 1726882436.53858: in VariableManager get_vars() 13531 1726882436.53894: done with get_vars() 13531 1726882436.53896: filtering new block on tags 13531 1726882436.53918: done filtering new block on tags 13531 1726882436.53920: in VariableManager get_vars() 13531 
1726882436.53950: done with get_vars() 13531 1726882436.53952: filtering new block on tags 13531 1726882436.53972: done filtering new block on tags 13531 1726882436.53974: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 13531 1726882436.53979: extending task lists for all hosts with included blocks 13531 1726882436.54850: done extending task lists 13531 1726882436.54851: done processing included files 13531 1726882436.54852: results queue empty 13531 1726882436.54853: checking for any_errors_fatal 13531 1726882436.54858: done checking for any_errors_fatal 13531 1726882436.54859: checking for max_fail_percentage 13531 1726882436.54860: done checking for max_fail_percentage 13531 1726882436.54861: checking to see if all hosts have failed and the running result is not ok 13531 1726882436.54862: done checking to see if all hosts have failed 13531 1726882436.54862: getting the remaining hosts for this loop 13531 1726882436.54865: done getting the remaining hosts for this loop 13531 1726882436.54868: getting the next task for host managed_node2 13531 1726882436.54872: done getting next task for host managed_node2 13531 1726882436.54875: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 13531 1726882436.54878: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882436.54888: getting variables 13531 1726882436.54889: in VariableManager get_vars() 13531 1726882436.54912: Calling all_inventory to load vars for managed_node2 13531 1726882436.54915: Calling groups_inventory to load vars for managed_node2 13531 1726882436.54917: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882436.54923: Calling all_plugins_play to load vars for managed_node2 13531 1726882436.54926: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882436.54929: Calling groups_plugins_play to load vars for managed_node2 13531 1726882436.56159: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882436.57806: done with get_vars() 13531 1726882436.57835: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:33:56 -0400 (0:00:00.116) 0:00:24.474 ****** 13531 1726882436.57919: entering _queue_task() for managed_node2/setup 13531 1726882436.58253: worker is 1 (out of 1 available) 13531 1726882436.58267: exiting _queue_task() for managed_node2/setup 13531 1726882436.58279: done queuing things up, now waiting for results queue to drain 13531 1726882436.58280: waiting for pending results... 
13531 1726882436.58576: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 13531 1726882436.58767: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000006c5 13531 1726882436.58788: variable 'ansible_search_path' from source: unknown 13531 1726882436.58797: variable 'ansible_search_path' from source: unknown 13531 1726882436.58841: calling self._execute() 13531 1726882436.58928: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882436.58943: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882436.58957: variable 'omit' from source: magic vars 13531 1726882436.59325: variable 'ansible_distribution_major_version' from source: facts 13531 1726882436.59345: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882436.59569: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13531 1726882436.61329: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13531 1726882436.61498: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13531 1726882436.61527: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13531 1726882436.61552: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13531 1726882436.61577: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13531 1726882436.61636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882436.61657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13531 1726882436.61679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13531 1726882436.61706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13531 1726882436.61716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13531 1726882436.61754: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13531 1726882436.61774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13531 1726882436.61791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13531 1726882436.61816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13531 1726882436.61826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13531 1726882436.61937: variable '__network_required_facts' from source: role '' defaults
13531 1726882436.61947: variable 'ansible_facts' from source: unknown
13531 1726882436.67892: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False
13531 1726882436.67897: when evaluation is False, skipping this task
13531 1726882436.67900: _execute() done
13531 1726882436.67902: dumping result to json
13531 1726882436.67905: done dumping result, returning
13531 1726882436.67907: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-4fd9-519d-0000000006c5]
13531 1726882436.67909: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000006c5
13531 1726882436.68032: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000006c5
13531 1726882436.68035: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
13531 1726882436.68074: no more pending results, returning what we have
13531 1726882436.68078: results queue empty
13531 1726882436.68079: checking for any_errors_fatal
13531 1726882436.68080: done checking for any_errors_fatal
13531 1726882436.68081: checking for max_fail_percentage
13531 1726882436.68083: done checking for max_fail_percentage
13531 1726882436.68083: checking to see if all hosts have failed and the running result is not ok
13531 1726882436.68084: done checking to see if all hosts have failed
13531 1726882436.68085: getting the remaining hosts for this loop
13531 1726882436.68086: done getting the remaining hosts for this loop
13531 1726882436.68089: getting the next task for host managed_node2
13531 1726882436.68096: done getting next task for host managed_node2
13531 1726882436.68100: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree
13531 1726882436.68104: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13531 1726882436.68123: getting variables
13531 1726882436.68124: in VariableManager get_vars()
13531 1726882436.68177: Calling all_inventory to load vars for managed_node2
13531 1726882436.68180: Calling groups_inventory to load vars for managed_node2
13531 1726882436.68182: Calling all_plugins_inventory to load vars for managed_node2
13531 1726882436.68192: Calling all_plugins_play to load vars for managed_node2
13531 1726882436.68195: Calling groups_plugins_inventory to load vars for managed_node2
13531 1726882436.68197: Calling groups_plugins_play to load vars for managed_node2
13531 1726882436.74051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13531 1726882436.75736: done with get_vars()
13531 1726882436.75770: done getting variables

TASK [fedora.linux_system_roles.network : Check if system is ostree] ***********
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12
Friday 20 September 2024 21:33:56 -0400 (0:00:00.179) 0:00:24.653 ******
13531 1726882436.75861: entering _queue_task() for managed_node2/stat
13531 1726882436.76180: worker is 1 (out of 1 available)
13531 1726882436.76192: exiting _queue_task() for managed_node2/stat
13531 1726882436.76204: done queuing things up, now waiting for results queue to drain
13531 1726882436.76206: waiting for pending results...
13531 1726882436.76495: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree
13531 1726882436.76653: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000006c7
13531 1726882436.76671: variable 'ansible_search_path' from source: unknown
13531 1726882436.76676: variable 'ansible_search_path' from source: unknown
13531 1726882436.76709: calling self._execute()
13531 1726882436.76810: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882436.76814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882436.76825: variable 'omit' from source: magic vars
13531 1726882436.77206: variable 'ansible_distribution_major_version' from source: facts
13531 1726882436.77218: Evaluated conditional (ansible_distribution_major_version != '6'): True
13531 1726882436.77387: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
13531 1726882436.77665: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
13531 1726882436.77707: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
13531 1726882436.77769: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
13531 1726882436.77803: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
13531 1726882436.77890: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
13531 1726882436.77911: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
13531 1726882436.77932: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
13531 1726882436.77960: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
13531 1726882436.78040: variable '__network_is_ostree' from source: set_fact
13531 1726882436.78046: Evaluated conditional (not __network_is_ostree is defined): False
13531 1726882436.78049: when evaluation is False, skipping this task
13531 1726882436.78058: _execute() done
13531 1726882436.78061: dumping result to json
13531 1726882436.78066: done dumping result, returning
13531 1726882436.78074: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-4fd9-519d-0000000006c7]
13531 1726882436.78080: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000006c7
13531 1726882436.78176: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000006c7
13531 1726882436.78179: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
13531 1726882436.78231: no more pending results, returning what we have
13531 1726882436.78236: results queue empty
13531 1726882436.78237: checking for any_errors_fatal
13531 1726882436.78246: done checking for any_errors_fatal
13531 1726882436.78247: checking for max_fail_percentage
13531 1726882436.78249: done checking for max_fail_percentage
13531 1726882436.78250: checking to see if all hosts have failed and the running result is not ok
13531 1726882436.78252: done checking to see if all hosts have failed
13531 1726882436.78252: getting the remaining hosts for this loop
13531 1726882436.78257: done getting the remaining hosts for this loop
13531 1726882436.78261: getting the next task for host managed_node2
13531 1726882436.78270: done getting next task for host managed_node2
13531 1726882436.78275: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
13531 1726882436.78280: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13531 1726882436.78301: getting variables
13531 1726882436.78303: in VariableManager get_vars()
13531 1726882436.78370: Calling all_inventory to load vars for managed_node2
13531 1726882436.78373: Calling groups_inventory to load vars for managed_node2
13531 1726882436.78376: Calling all_plugins_inventory to load vars for managed_node2
13531 1726882436.78389: Calling all_plugins_play to load vars for managed_node2
13531 1726882436.78392: Calling groups_plugins_inventory to load vars for managed_node2
13531 1726882436.78395: Calling groups_plugins_play to load vars for managed_node2
13531 1726882436.80134: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13531 1726882436.81878: done with get_vars()
13531 1726882436.81906: done getting variables
13531 1726882436.81970: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17
Friday 20 September 2024 21:33:56 -0400 (0:00:00.061) 0:00:24.715 ******
13531 1726882436.82007: entering _queue_task() for managed_node2/set_fact
13531 1726882436.82334: worker is 1 (out of 1 available)
13531 1726882436.82346: exiting _queue_task() for managed_node2/set_fact
13531 1726882436.82362: done queuing things up, now waiting for results queue to drain
13531 1726882436.82365: waiting for pending results...
13531 1726882436.82653: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
13531 1726882436.82814: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000006c8
13531 1726882436.82827: variable 'ansible_search_path' from source: unknown
13531 1726882436.82830: variable 'ansible_search_path' from source: unknown
13531 1726882436.82872: calling self._execute()
13531 1726882436.82971: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882436.82975: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882436.82985: variable 'omit' from source: magic vars
13531 1726882436.83368: variable 'ansible_distribution_major_version' from source: facts
13531 1726882436.83380: Evaluated conditional (ansible_distribution_major_version != '6'): True
13531 1726882436.83545: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
13531 1726882436.83826: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
13531 1726882436.83872: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
13531 1726882436.84241: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
13531 1726882436.84279: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
13531 1726882436.84368: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
13531 1726882436.84394: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
13531 1726882436.84419: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
13531 1726882436.84448: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
13531 1726882436.84540: variable '__network_is_ostree' from source: set_fact
13531 1726882436.84547: Evaluated conditional (not __network_is_ostree is defined): False
13531 1726882436.84549: when evaluation is False, skipping this task
13531 1726882436.84552: _execute() done
13531 1726882436.84554: dumping result to json
13531 1726882436.84561: done dumping result, returning
13531 1726882436.84570: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-4fd9-519d-0000000006c8]
13531 1726882436.84576: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000006c8
13531 1726882436.84673: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000006c8
13531 1726882436.84677: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
13531 1726882436.84724: no more pending results, returning what we have
13531 1726882436.84728: results queue empty
13531 1726882436.84729: checking for any_errors_fatal
13531 1726882436.84736: done checking for any_errors_fatal
13531 1726882436.84736: checking for max_fail_percentage
13531 1726882436.84738: done checking for max_fail_percentage
13531 1726882436.84739: checking to see if all hosts have failed and the running result is not ok
13531 1726882436.84740: done checking to see if all hosts have failed
13531 1726882436.84741: getting the remaining hosts for this loop
13531 1726882436.84742: done getting the remaining hosts for this loop
13531 1726882436.84746: getting the next task for host managed_node2
13531 1726882436.84758: done getting next task for host managed_node2
13531 1726882436.84762: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running
13531 1726882436.84768: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13531 1726882436.84785: getting variables
13531 1726882436.84788: in VariableManager get_vars()
13531 1726882436.84844: Calling all_inventory to load vars for managed_node2
13531 1726882436.84847: Calling groups_inventory to load vars for managed_node2
13531 1726882436.84850: Calling all_plugins_inventory to load vars for managed_node2
13531 1726882436.84865: Calling all_plugins_play to load vars for managed_node2
13531 1726882436.84868: Calling groups_plugins_inventory to load vars for managed_node2
13531 1726882436.84871: Calling groups_plugins_play to load vars for managed_node2
13531 1726882436.86671: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13531 1726882436.88400: done with get_vars()
13531 1726882436.88432: done getting variables

TASK [fedora.linux_system_roles.network : Check which services are running] ****
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Friday 20 September 2024 21:33:56 -0400 (0:00:00.065) 0:00:24.780 ******
13531 1726882436.88545: entering _queue_task() for managed_node2/service_facts
13531 1726882436.88906: worker is 1 (out of 1 available)
13531 1726882436.88917: exiting _queue_task() for managed_node2/service_facts
13531 1726882436.88931: done queuing things up, now waiting for results queue to drain
13531 1726882436.88932: waiting for pending results...
13531 1726882436.89248: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running
13531 1726882436.89413: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000006ca
13531 1726882436.89426: variable 'ansible_search_path' from source: unknown
13531 1726882436.89430: variable 'ansible_search_path' from source: unknown
13531 1726882436.89473: calling self._execute()
13531 1726882436.89577: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882436.89581: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882436.89592: variable 'omit' from source: magic vars
13531 1726882436.89990: variable 'ansible_distribution_major_version' from source: facts
13531 1726882436.90003: Evaluated conditional (ansible_distribution_major_version != '6'): True
13531 1726882436.90009: variable 'omit' from source: magic vars
13531 1726882436.90091: variable 'omit' from source: magic vars
13531 1726882436.90126: variable 'omit' from source: magic vars
13531 1726882436.90179: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
13531 1726882436.90212: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
13531 1726882436.90232: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
13531 1726882436.90254: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13531 1726882436.90271: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13531 1726882436.90301: variable 'inventory_hostname' from source: host vars for 'managed_node2'
13531 1726882436.90304: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882436.90307: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882436.90418: Set connection var ansible_pipelining to False
13531 1726882436.90423: Set connection var ansible_timeout to 10
13531 1726882436.90429: Set connection var ansible_shell_executable to /bin/sh
13531 1726882436.90434: Set connection var ansible_module_compression to ZIP_DEFLATED
13531 1726882436.90437: Set connection var ansible_connection to ssh
13531 1726882436.90439: Set connection var ansible_shell_type to sh
13531 1726882436.90475: variable 'ansible_shell_executable' from source: unknown
13531 1726882436.90479: variable 'ansible_connection' from source: unknown
13531 1726882436.90482: variable 'ansible_module_compression' from source: unknown
13531 1726882436.90484: variable 'ansible_shell_type' from source: unknown
13531 1726882436.90487: variable 'ansible_shell_executable' from source: unknown
13531 1726882436.90489: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882436.90491: variable 'ansible_pipelining' from source: unknown
13531 1726882436.90494: variable 'ansible_timeout' from source: unknown
13531 1726882436.90498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882436.90687: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__)
13531 1726882436.90698: variable 'omit' from source: magic vars
13531 1726882436.90703: starting attempt loop
13531 1726882436.90706: running the handler
13531 1726882436.90718: _low_level_execute_command(): starting
13531 1726882436.90726: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
13531 1726882436.91468: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
13531 1726882436.91483: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
13531 1726882436.91496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
13531 1726882436.91509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
13531 1726882436.91551: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<<
13531 1726882436.91565: stderr chunk (state=3): >>>debug2: match not found <<<
13531 1726882436.91579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13531 1726882436.91593: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
13531 1726882436.91602: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<<
13531 1726882436.91609: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
13531 1726882436.91617: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
13531 1726882436.91626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
13531 1726882436.91637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
13531 1726882436.91646: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<<
13531 1726882436.91657: stderr chunk (state=3): >>>debug2: match found <<<
13531 1726882436.91667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13531 1726882436.91741: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
13531 1726882436.91766: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
13531 1726882436.91778: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
13531 1726882436.91919: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13531 1726882436.93599: stdout chunk (state=3): >>>/root <<<
13531 1726882436.93702: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13531 1726882436.93759: stderr chunk (state=3): >>><<<
13531 1726882436.93762: stdout chunk (state=3): >>><<<
13531 1726882436.93786: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
13531 1726882436.93798: _low_level_execute_command(): starting
13531 1726882436.93806: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882436.9378593-14648-128814195139653 `" && echo ansible-tmp-1726882436.9378593-14648-128814195139653="` echo /root/.ansible/tmp/ansible-tmp-1726882436.9378593-14648-128814195139653 `" ) && sleep 0'
13531 1726882436.94272: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
13531 1726882436.94276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
13531 1726882436.94313: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<<
13531 1726882436.94317: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
13531 1726882436.94320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13531 1726882436.94369: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
13531 1726882436.94373: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
13531 1726882436.94491: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13531 1726882436.96375: stdout chunk (state=3): >>>ansible-tmp-1726882436.9378593-14648-128814195139653=/root/.ansible/tmp/ansible-tmp-1726882436.9378593-14648-128814195139653 <<<
13531 1726882436.96488: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13531 1726882436.96569: stderr chunk (state=3): >>><<<
13531 1726882436.96572: stdout chunk (state=3): >>><<<
13531 1726882436.96649: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882436.9378593-14648-128814195139653=/root/.ansible/tmp/ansible-tmp-1726882436.9378593-14648-128814195139653 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
13531 1726882436.96794: variable 'ansible_module_compression' from source: unknown
13531 1726882436.96797: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13531i_snf1g8/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED
13531 1726882436.96799: variable 'ansible_facts' from source: unknown
13531 1726882436.96914: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882436.9378593-14648-128814195139653/AnsiballZ_service_facts.py
13531 1726882436.97609: Sending initial data
13531 1726882436.97612: Sent initial data (162 bytes)
13531 1726882436.98872: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
13531 1726882436.98885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
13531 1726882436.98916: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13531 1726882436.98920: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<<
13531 1726882436.98922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13531 1726882436.98982: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
13531 1726882436.98986: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
13531 1726882436.99103: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13531 1726882437.00875: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<<
13531 1726882437.00966: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<<
13531 1726882437.01054: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13531i_snf1g8/tmpbqhkfi4e /root/.ansible/tmp/ansible-tmp-1726882436.9378593-14648-128814195139653/AnsiballZ_service_facts.py <<<
13531 1726882437.01157: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<<
13531 1726882437.02557: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13531 1726882437.02751: stderr chunk (state=3): >>><<<
13531 1726882437.02757: stdout chunk (state=3): >>><<<
13531 1726882437.02760: done transferring module to remote
13531 1726882437.02762: _low_level_execute_command(): starting
13531 1726882437.02769: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882436.9378593-14648-128814195139653/ /root/.ansible/tmp/ansible-tmp-1726882436.9378593-14648-128814195139653/AnsiballZ_service_facts.py && sleep 0'
13531 1726882437.04279: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
13531 1726882437.04298: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
13531 1726882437.04320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
13531 1726882437.04350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
13531 1726882437.04406: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<<
13531 1726882437.04437: stderr chunk (state=3): >>>debug2: match not found <<<
13531 1726882437.04457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13531 1726882437.04478: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
13531 1726882437.04491: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<<
13531 1726882437.04509: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
13531 1726882437.04549: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
13531 1726882437.04569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
13531 1726882437.04584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
13531 1726882437.04596: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<<
13531 1726882437.04606: stderr chunk (state=3): >>>debug2: match found <<<
13531 1726882437.04620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13531 1726882437.04710: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
13531 1726882437.04747: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
13531 1726882437.04779: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
13531 1726882437.04911: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13531 1726882437.06819: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13531 1726882437.06822: stdout chunk (state=3): >>><<<
13531 1726882437.06825: stderr chunk (state=3): >>><<<
13531 1726882437.06923: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
13531 1726882437.06926: _low_level_execute_command(): starting
13531 1726882437.06929: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882436.9378593-14648-128814195139653/AnsiballZ_service_facts.py && sleep 0'
13531 1726882437.07588: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
13531 1726882437.07639: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
13531 1726882437.07681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
13531 1726882437.07699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
13531 1726882437.07744: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<<
13531 1726882437.07759: stderr chunk (state=3): >>>debug2: match not found <<<
13531 1726882437.07775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13531 1726882437.07792: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
13531 1726882437.07802: stderr chunk (state=3): >>>debug2:
resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882437.07811: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882437.07821: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882437.07832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882437.07847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882437.07866: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882437.07877: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882437.07888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882437.07957: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882437.07982: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882437.07997: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882437.08147: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882438.43374: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": 
"network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", 
"status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": 
"systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": 
"disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.servi<<< 13531 1726882438.43388: stdout chunk (state=3): >>>ce": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "system<<< 13531 
1726882438.43407: stdout chunk (state=3): >>>d"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 13531 1726882438.44707: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 13531 1726882438.44711: stdout chunk (state=3): >>><<< 13531 1726882438.44714: stderr chunk (state=3): >>><<< 13531 1726882438.45171: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": 
"nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": 
"serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": 
"systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": 
"rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", 
"state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
13531 1726882438.45538: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882436.9378593-14648-128814195139653/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13531 1726882438.45582: _low_level_execute_command(): starting 13531 1726882438.45878: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882436.9378593-14648-128814195139653/ > /dev/null 2>&1 && sleep 0' 13531 1726882438.47303: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882438.48184: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882438.48202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882438.48222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882438.48270: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882438.48284: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882438.48298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882438.48318: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882438.48326: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is 
address <<< 13531 1726882438.48331: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882438.48340: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882438.48349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882438.48360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882438.48370: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882438.48377: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882438.48386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882438.48459: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882438.48480: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882438.48493: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882438.48624: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882438.50566: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882438.50570: stdout chunk (state=3): >>><<< 13531 1726882438.50576: stderr chunk (state=3): >>><<< 13531 1726882438.50594: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882438.50600: handler run complete 13531 1726882438.50774: variable 'ansible_facts' from source: unknown 13531 1726882438.50918: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882438.51547: variable 'ansible_facts' from source: unknown 13531 1726882438.51781: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882438.52022: attempt loop complete, returning result 13531 1726882438.52028: _execute() done 13531 1726882438.52031: dumping result to json 13531 1726882438.52092: done dumping result, returning 13531 1726882438.52100: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-4fd9-519d-0000000006ca] 13531 1726882438.52107: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000006ca ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13531 1726882438.53125: no more pending results, returning what we have 13531 1726882438.53127: results queue empty 13531 1726882438.53128: checking for any_errors_fatal 13531 1726882438.53133: done checking for any_errors_fatal 13531 1726882438.53134: checking for max_fail_percentage 13531 
1726882438.53135: done checking for max_fail_percentage 13531 1726882438.53136: checking to see if all hosts have failed and the running result is not ok 13531 1726882438.53136: done checking to see if all hosts have failed 13531 1726882438.53137: getting the remaining hosts for this loop 13531 1726882438.53138: done getting the remaining hosts for this loop 13531 1726882438.53141: getting the next task for host managed_node2 13531 1726882438.53146: done getting next task for host managed_node2 13531 1726882438.53150: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 13531 1726882438.53156: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882438.53169: getting variables 13531 1726882438.53170: in VariableManager get_vars() 13531 1726882438.53215: Calling all_inventory to load vars for managed_node2 13531 1726882438.53218: Calling groups_inventory to load vars for managed_node2 13531 1726882438.53220: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882438.53230: Calling all_plugins_play to load vars for managed_node2 13531 1726882438.53232: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882438.53235: Calling groups_plugins_play to load vars for managed_node2 13531 1726882438.53881: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000006ca 13531 1726882438.53889: WORKER PROCESS EXITING 13531 1726882438.54883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882438.55851: done with get_vars() 13531 1726882438.55871: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:33:58 -0400 (0:00:01.673) 0:00:26.454 ****** 13531 1726882438.55943: entering _queue_task() for managed_node2/package_facts 13531 1726882438.56183: worker is 1 (out of 1 available) 13531 1726882438.56196: exiting _queue_task() for managed_node2/package_facts 13531 1726882438.56209: done queuing things up, now waiting for results queue to drain 13531 1726882438.56211: waiting for pending results... 
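Each trace entry in this log follows the pattern `<pid> <epoch-seconds-with-microseconds>: <message>` (here the worker pid is 13531). A small sketch of splitting one such entry into its parts; the regex is our own assumption based on the lines above, not a format ansible-core documents or exports:

```python
import re

# Sketch: split one ansible -vvv trace entry of the form
# "<pid> <unix-timestamp>: <message>" into its parts.
# LOG_RE encodes our reading of the format seen in this log.
LOG_RE = re.compile(r"^(\d+) (\d+\.\d+): (.*)$")

def parse_entry(line):
    """Return (pid, timestamp, message) for a trace line, or None."""
    m = LOG_RE.match(line)
    if m is None:
        return None
    pid, ts, msg = m.groups()
    return int(pid), float(ts), msg

entry = parse_entry(
    "13531 1726882438.56183: exiting _queue_task() for managed_node2/package_facts"
)
print(entry)
```

Sorting parsed entries by the timestamp field is one way to interleave output from multiple worker processes when debugging a run like this one.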
13531 1726882438.56582: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 13531 1726882438.56796: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000006cb 13531 1726882438.56819: variable 'ansible_search_path' from source: unknown 13531 1726882438.56836: variable 'ansible_search_path' from source: unknown 13531 1726882438.56882: calling self._execute() 13531 1726882438.56993: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882438.57010: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882438.57023: variable 'omit' from source: magic vars 13531 1726882438.57436: variable 'ansible_distribution_major_version' from source: facts 13531 1726882438.57462: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882438.57486: variable 'omit' from source: magic vars 13531 1726882438.57575: variable 'omit' from source: magic vars 13531 1726882438.57619: variable 'omit' from source: magic vars 13531 1726882438.57677: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882438.57728: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882438.57751: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882438.57783: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882438.57812: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882438.57846: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882438.57856: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882438.57867: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 13531 1726882438.58388: Set connection var ansible_pipelining to False 13531 1726882438.58406: Set connection var ansible_timeout to 10 13531 1726882438.58427: Set connection var ansible_shell_executable to /bin/sh 13531 1726882438.58437: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882438.58462: Set connection var ansible_connection to ssh 13531 1726882438.58472: Set connection var ansible_shell_type to sh 13531 1726882438.58518: variable 'ansible_shell_executable' from source: unknown 13531 1726882438.58537: variable 'ansible_connection' from source: unknown 13531 1726882438.58545: variable 'ansible_module_compression' from source: unknown 13531 1726882438.58552: variable 'ansible_shell_type' from source: unknown 13531 1726882438.58567: variable 'ansible_shell_executable' from source: unknown 13531 1726882438.58575: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882438.58584: variable 'ansible_pipelining' from source: unknown 13531 1726882438.58594: variable 'ansible_timeout' from source: unknown 13531 1726882438.58602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882438.58820: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13531 1726882438.58841: variable 'omit' from source: magic vars 13531 1726882438.58852: starting attempt loop 13531 1726882438.58867: running the handler 13531 1726882438.58884: _low_level_execute_command(): starting 13531 1726882438.58895: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13531 1726882438.59569: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 13531 1726882438.59602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882438.59605: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 13531 1726882438.59610: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882438.59673: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882438.59675: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882438.59677: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882438.59773: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882438.61432: stdout chunk (state=3): >>>/root <<< 13531 1726882438.61531: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882438.61606: stderr chunk (state=3): >>><<< 13531 1726882438.61609: stdout chunk (state=3): >>><<< 13531 1726882438.61679: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882438.61683: _low_level_execute_command(): starting 13531 1726882438.61687: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882438.6162734-14716-116003872141687 `" && echo ansible-tmp-1726882438.6162734-14716-116003872141687="` echo /root/.ansible/tmp/ansible-tmp-1726882438.6162734-14716-116003872141687 `" ) && sleep 0' 13531 1726882438.62911: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882438.62929: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882438.62975: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882438.62987: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882438.62990: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882438.63037: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882438.63040: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882438.63044: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882438.63143: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882438.65023: stdout chunk (state=3): >>>ansible-tmp-1726882438.6162734-14716-116003872141687=/root/.ansible/tmp/ansible-tmp-1726882438.6162734-14716-116003872141687 <<< 13531 1726882438.65139: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882438.65220: stderr chunk (state=3): >>><<< 13531 1726882438.65230: stdout chunk (state=3): >>><<< 13531 1726882438.65573: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882438.6162734-14716-116003872141687=/root/.ansible/tmp/ansible-tmp-1726882438.6162734-14716-116003872141687 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882438.65576: variable 'ansible_module_compression' from source: unknown 13531 1726882438.65578: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13531i_snf1g8/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 13531 1726882438.65581: variable 'ansible_facts' from source: unknown 13531 1726882438.65670: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882438.6162734-14716-116003872141687/AnsiballZ_package_facts.py 13531 1726882438.65920: Sending initial data 13531 1726882438.65929: Sent initial data (162 bytes) 13531 1726882438.66877: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882438.66894: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882438.66916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882438.66934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882438.66983: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882438.66996: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882438.67016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882438.67033: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882438.67043: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882438.67056: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882438.67072: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882438.67086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882438.67101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882438.67113: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882438.67133: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882438.67148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882438.67298: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882438.67329: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882438.67357: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882438.67485: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882438.69250: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13531 
1726882438.69327: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13531 1726882438.69434: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13531i_snf1g8/tmpgu_1ubj5 /root/.ansible/tmp/ansible-tmp-1726882438.6162734-14716-116003872141687/AnsiballZ_package_facts.py <<< 13531 1726882438.69528: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13531 1726882438.71787: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882438.71885: stderr chunk (state=3): >>><<< 13531 1726882438.71889: stdout chunk (state=3): >>><<< 13531 1726882438.71908: done transferring module to remote 13531 1726882438.71920: _low_level_execute_command(): starting 13531 1726882438.71923: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882438.6162734-14716-116003872141687/ /root/.ansible/tmp/ansible-tmp-1726882438.6162734-14716-116003872141687/AnsiballZ_package_facts.py && sleep 0' 13531 1726882438.72377: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882438.72383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882438.72427: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882438.72430: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 
1726882438.72437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882438.72497: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882438.72500: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882438.72603: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882438.74422: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882438.74468: stderr chunk (state=3): >>><<< 13531 1726882438.74472: stdout chunk (state=3): >>><<< 13531 1726882438.74489: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882438.74492: _low_level_execute_command(): starting 13531 1726882438.74500: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882438.6162734-14716-116003872141687/AnsiballZ_package_facts.py && sleep 0' 13531 1726882438.74940: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882438.74947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882438.74979: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882438.74992: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882438.75043: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882438.75057: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882438.75180: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882439.21393: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": 
"2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", 
"release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86<<< 13531 1726882439.21411: stdout chunk (state=3): >>>_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": 
[{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": 
[{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", 
"version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64",
"source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}],
"openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": 
"28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch":
null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": 
"kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", 
"version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": 
"libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch":
null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", 
"release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch":
"x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": 
"4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}],
"iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": 
"3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": 
"perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "per<<< 13531 1726882439.21523: stdout chunk (state=3): >>>l-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": 
"462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8<<< 13531 1726882439.21554: stdout chunk (state=3): >>>.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": 
"tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", 
"release": "7.el9", "epoch":<<< 13531 1726882439.21577: stdout chunk (state=3): >>> null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", 
"version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], 
"sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "releas<<< 13531 1726882439.21586: stdout chunk (state=3): >>>e": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 13531 1726882439.23047: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 13531 1726882439.23110: stderr chunk (state=3): >>><<< 13531 1726882439.23113: stdout chunk (state=3): >>><<< 13531 1726882439.23149: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": 
"6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": 
"26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": 
[{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": 
"3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", 
"version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": 
"p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": 
"kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": 
"5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": 
"grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": 
[{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": 
"1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", 
"version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": 
"4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": 
"1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", 
"version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": 
"1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": 
"rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": 
"rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", 
"source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": 
"perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": 
"rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", 
"epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", 
"version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": 
"python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
13531 1726882439.24633: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882438.6162734-14716-116003872141687/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13531 1726882439.24650: _low_level_execute_command(): starting 13531 1726882439.24657: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882438.6162734-14716-116003872141687/ > /dev/null 2>&1 && sleep 0' 13531 1726882439.25129: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882439.25141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882439.25171: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 13531 1726882439.25183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 
originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882439.25228: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882439.25242: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882439.25260: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882439.25364: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882439.27181: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882439.27229: stderr chunk (state=3): >>><<< 13531 1726882439.27232: stdout chunk (state=3): >>><<< 13531 1726882439.27244: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882439.27249: handler run 
complete 13531 1726882439.27780: variable 'ansible_facts' from source: unknown 13531 1726882439.28084: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882439.30682: variable 'ansible_facts' from source: unknown 13531 1726882439.31175: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882439.32000: attempt loop complete, returning result 13531 1726882439.32020: _execute() done 13531 1726882439.32027: dumping result to json 13531 1726882439.32250: done dumping result, returning 13531 1726882439.32270: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-4fd9-519d-0000000006cb] 13531 1726882439.32283: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000006cb 13531 1726882439.34561: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000006cb 13531 1726882439.34567: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13531 1726882439.34725: no more pending results, returning what we have 13531 1726882439.34728: results queue empty 13531 1726882439.34729: checking for any_errors_fatal 13531 1726882439.34735: done checking for any_errors_fatal 13531 1726882439.34736: checking for max_fail_percentage 13531 1726882439.34737: done checking for max_fail_percentage 13531 1726882439.34738: checking to see if all hosts have failed and the running result is not ok 13531 1726882439.34739: done checking to see if all hosts have failed 13531 1726882439.34740: getting the remaining hosts for this loop 13531 1726882439.34741: done getting the remaining hosts for this loop 13531 1726882439.34745: getting the next task for host managed_node2 13531 1726882439.34751: done getting next task for host managed_node2 13531 
1726882439.34758: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 13531 1726882439.34760: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882439.34774: getting variables 13531 1726882439.34776: in VariableManager get_vars() 13531 1726882439.34820: Calling all_inventory to load vars for managed_node2 13531 1726882439.34823: Calling groups_inventory to load vars for managed_node2 13531 1726882439.34825: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882439.34835: Calling all_plugins_play to load vars for managed_node2 13531 1726882439.34838: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882439.34841: Calling groups_plugins_play to load vars for managed_node2 13531 1726882439.36248: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882439.37982: done with get_vars() 13531 1726882439.38009: done getting variables 13531 1726882439.38071: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:33:59 -0400 (0:00:00.821) 0:00:27.276 ****** 13531 1726882439.38104: entering _queue_task() for managed_node2/debug 13531 1726882439.38414: worker is 1 (out of 1 available) 13531 1726882439.38425: exiting _queue_task() for managed_node2/debug 13531 1726882439.38438: done queuing things up, now waiting for results queue to drain 13531 1726882439.38439: waiting for pending results... 13531 1726882439.38727: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 13531 1726882439.38885: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000007c 13531 1726882439.38905: variable 'ansible_search_path' from source: unknown 13531 1726882439.38912: variable 'ansible_search_path' from source: unknown 13531 1726882439.38952: calling self._execute() 13531 1726882439.39055: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882439.39070: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882439.39084: variable 'omit' from source: magic vars 13531 1726882439.39471: variable 'ansible_distribution_major_version' from source: facts 13531 1726882439.39489: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882439.39502: variable 'omit' from source: magic vars 13531 1726882439.39570: variable 'omit' from source: magic vars 13531 1726882439.39674: variable 'network_provider' from source: set_fact 13531 1726882439.39696: variable 'omit' from source: magic vars 13531 1726882439.39741: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882439.39784: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882439.39807: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 
1726882439.39828: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882439.39841: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882439.39882: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882439.39889: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882439.39896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882439.40005: Set connection var ansible_pipelining to False 13531 1726882439.40015: Set connection var ansible_timeout to 10 13531 1726882439.40023: Set connection var ansible_shell_executable to /bin/sh 13531 1726882439.40032: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882439.40037: Set connection var ansible_connection to ssh 13531 1726882439.40043: Set connection var ansible_shell_type to sh 13531 1726882439.40078: variable 'ansible_shell_executable' from source: unknown 13531 1726882439.40086: variable 'ansible_connection' from source: unknown 13531 1726882439.40093: variable 'ansible_module_compression' from source: unknown 13531 1726882439.40099: variable 'ansible_shell_type' from source: unknown 13531 1726882439.40105: variable 'ansible_shell_executable' from source: unknown 13531 1726882439.40111: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882439.40117: variable 'ansible_pipelining' from source: unknown 13531 1726882439.40123: variable 'ansible_timeout' from source: unknown 13531 1726882439.40130: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882439.40271: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882439.40292: variable 'omit' from source: magic vars 13531 1726882439.40302: starting attempt loop 13531 1726882439.40309: running the handler 13531 1726882439.40357: handler run complete 13531 1726882439.40379: attempt loop complete, returning result 13531 1726882439.40387: _execute() done 13531 1726882439.40399: dumping result to json 13531 1726882439.40405: done dumping result, returning 13531 1726882439.40416: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-4fd9-519d-00000000007c] 13531 1726882439.40426: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000007c ok: [managed_node2] => {} MSG: Using network provider: nm 13531 1726882439.40586: no more pending results, returning what we have 13531 1726882439.40589: results queue empty 13531 1726882439.40590: checking for any_errors_fatal 13531 1726882439.40600: done checking for any_errors_fatal 13531 1726882439.40601: checking for max_fail_percentage 13531 1726882439.40603: done checking for max_fail_percentage 13531 1726882439.40604: checking to see if all hosts have failed and the running result is not ok 13531 1726882439.40604: done checking to see if all hosts have failed 13531 1726882439.40605: getting the remaining hosts for this loop 13531 1726882439.40607: done getting the remaining hosts for this loop 13531 1726882439.40610: getting the next task for host managed_node2 13531 1726882439.40617: done getting next task for host managed_node2 13531 1726882439.40622: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13531 1726882439.40625: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882439.40636: getting variables 13531 1726882439.40638: in VariableManager get_vars() 13531 1726882439.40698: Calling all_inventory to load vars for managed_node2 13531 1726882439.40701: Calling groups_inventory to load vars for managed_node2 13531 1726882439.40704: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882439.40715: Calling all_plugins_play to load vars for managed_node2 13531 1726882439.40719: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882439.40722: Calling groups_plugins_play to load vars for managed_node2 13531 1726882439.41682: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000007c 13531 1726882439.41685: WORKER PROCESS EXITING 13531 1726882439.42565: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882439.44401: done with get_vars() 13531 1726882439.44425: done getting variables 13531 1726882439.44487: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:33:59 -0400 (0:00:00.064) 0:00:27.340 ****** 13531 1726882439.44520: entering _queue_task() for managed_node2/fail 13531 1726882439.44832: worker is 1 (out of 1 available) 13531 1726882439.44844: exiting _queue_task() for managed_node2/fail 13531 1726882439.44859: done queuing things up, now waiting for results queue to drain 13531 1726882439.44861: waiting for pending results... 13531 1726882439.45174: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13531 1726882439.45321: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000007d 13531 1726882439.45341: variable 'ansible_search_path' from source: unknown 13531 1726882439.45349: variable 'ansible_search_path' from source: unknown 13531 1726882439.45393: calling self._execute() 13531 1726882439.45493: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882439.45504: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882439.45518: variable 'omit' from source: magic vars 13531 1726882439.45892: variable 'ansible_distribution_major_version' from source: facts 13531 1726882439.45910: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882439.46032: variable 'network_state' from source: role '' defaults 13531 1726882439.46047: Evaluated conditional (network_state != {}): False 13531 1726882439.46060: when evaluation is False, skipping this task 13531 1726882439.46074: _execute() done 13531 1726882439.46083: dumping result to json 13531 1726882439.46091: done dumping result, returning 13531 1726882439.46103: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the 
`network_state` variable with the initscripts provider [0e448fcc-3ce9-4fd9-519d-00000000007d] 13531 1726882439.46116: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000007d skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13531 1726882439.46271: no more pending results, returning what we have 13531 1726882439.46275: results queue empty 13531 1726882439.46276: checking for any_errors_fatal 13531 1726882439.46285: done checking for any_errors_fatal 13531 1726882439.46285: checking for max_fail_percentage 13531 1726882439.46287: done checking for max_fail_percentage 13531 1726882439.46288: checking to see if all hosts have failed and the running result is not ok 13531 1726882439.46289: done checking to see if all hosts have failed 13531 1726882439.46290: getting the remaining hosts for this loop 13531 1726882439.46291: done getting the remaining hosts for this loop 13531 1726882439.46295: getting the next task for host managed_node2 13531 1726882439.46301: done getting next task for host managed_node2 13531 1726882439.46306: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13531 1726882439.46309: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882439.46330: getting variables 13531 1726882439.46333: in VariableManager get_vars() 13531 1726882439.46393: Calling all_inventory to load vars for managed_node2 13531 1726882439.46396: Calling groups_inventory to load vars for managed_node2 13531 1726882439.46398: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882439.46412: Calling all_plugins_play to load vars for managed_node2 13531 1726882439.46415: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882439.46418: Calling groups_plugins_play to load vars for managed_node2 13531 1726882439.47385: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000007d 13531 1726882439.47389: WORKER PROCESS EXITING 13531 1726882439.48239: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882439.50009: done with get_vars() 13531 1726882439.50041: done getting variables 13531 1726882439.50106: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:33:59 -0400 (0:00:00.056) 0:00:27.396 ****** 13531 1726882439.50143: entering _queue_task() for managed_node2/fail 13531 1726882439.50485: worker is 1 (out of 1 available) 13531 1726882439.50499: exiting _queue_task() for managed_node2/fail 13531 1726882439.50511: done queuing things up, now waiting for results queue to drain 13531 1726882439.50512: waiting for pending results... 
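The `Print network provider` task that completed above (task path `roles/network/tasks/main.yml:7`, result `Using network provider: nm`) is a plain `debug` action over the `network_provider` variable, which the log shows coming from `set_fact`. A hypothetical sketch of what such a task could look like — the exact message template is an assumption, not taken from the role source:

```yaml
# Hypothetical sketch only -- the actual task at
# roles/network/tasks/main.yml:7 may differ in wording.
- name: Print network provider
  debug:
    msg: "Using network provider: {{ network_provider }}"
```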
13531 1726882439.50815: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13531 1726882439.50975: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000007e 13531 1726882439.50994: variable 'ansible_search_path' from source: unknown 13531 1726882439.51002: variable 'ansible_search_path' from source: unknown 13531 1726882439.51043: calling self._execute() 13531 1726882439.51143: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882439.51157: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882439.51175: variable 'omit' from source: magic vars 13531 1726882439.51550: variable 'ansible_distribution_major_version' from source: facts 13531 1726882439.51572: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882439.51704: variable 'network_state' from source: role '' defaults 13531 1726882439.51723: Evaluated conditional (network_state != {}): False 13531 1726882439.51730: when evaluation is False, skipping this task 13531 1726882439.51737: _execute() done 13531 1726882439.51743: dumping result to json 13531 1726882439.51749: done dumping result, returning 13531 1726882439.51764: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-4fd9-519d-00000000007e] 13531 1726882439.51775: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000007e skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13531 1726882439.51924: no more pending results, returning what we have 13531 1726882439.51928: results queue empty 13531 1726882439.51929: checking for any_errors_fatal 13531 1726882439.51938: done checking for any_errors_fatal 
13531 1726882439.51938: checking for max_fail_percentage 13531 1726882439.51940: done checking for max_fail_percentage 13531 1726882439.51941: checking to see if all hosts have failed and the running result is not ok 13531 1726882439.51942: done checking to see if all hosts have failed 13531 1726882439.51943: getting the remaining hosts for this loop 13531 1726882439.51945: done getting the remaining hosts for this loop 13531 1726882439.51948: getting the next task for host managed_node2 13531 1726882439.51957: done getting next task for host managed_node2 13531 1726882439.51961: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13531 1726882439.51966: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882439.51986: getting variables 13531 1726882439.51988: in VariableManager get_vars() 13531 1726882439.52043: Calling all_inventory to load vars for managed_node2 13531 1726882439.52046: Calling groups_inventory to load vars for managed_node2 13531 1726882439.52049: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882439.52066: Calling all_plugins_play to load vars for managed_node2 13531 1726882439.52069: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882439.52073: Calling groups_plugins_play to load vars for managed_node2 13531 1726882439.53082: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000007e 13531 1726882439.53085: WORKER PROCESS EXITING 13531 1726882439.53962: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882439.55705: done with get_vars() 13531 1726882439.55730: done getting variables 13531 1726882439.55792: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:33:59 -0400 (0:00:00.056) 0:00:27.453 ****** 13531 1726882439.55824: entering _queue_task() for managed_node2/fail 13531 1726882439.56141: worker is 1 (out of 1 available) 13531 1726882439.56153: exiting _queue_task() for managed_node2/fail 13531 1726882439.56170: done queuing things up, now waiting for results queue to drain 13531 1726882439.56172: waiting for pending results... 
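Both `Abort applying the network state configuration ...` tasks above were skipped for the same reason: their guard `network_state != {}` evaluated False, because `network_state` still holds its empty role default in this run. A hypothetical sketch of the guarded `fail` pattern the skip results imply — the task body and message text are assumptions:

```yaml
# Hypothetical sketch -- mirrors the skip results in the log,
# not the actual role source.
- name: Abort applying the network state configuration if using the
        `network_state` variable with the initscripts provider
  fail:
    msg: "network_state is not supported with the initscripts provider"
  # Evaluated False here ("false_condition": "network_state != {}"),
  # so the task reports skipping with "Conditional result was False".
  when: network_state != {}
```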
13531 1726882439.56465: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13531 1726882439.56610: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000007f 13531 1726882439.56630: variable 'ansible_search_path' from source: unknown 13531 1726882439.56637: variable 'ansible_search_path' from source: unknown 13531 1726882439.56684: calling self._execute() 13531 1726882439.56790: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882439.56801: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882439.56815: variable 'omit' from source: magic vars 13531 1726882439.57194: variable 'ansible_distribution_major_version' from source: facts 13531 1726882439.57212: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882439.57366: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13531 1726882439.59827: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13531 1726882439.59903: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13531 1726882439.59943: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13531 1726882439.59990: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13531 1726882439.60022: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13531 1726882439.60111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882439.60160: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882439.60194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882439.60244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882439.60272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882439.60379: variable 'ansible_distribution_major_version' from source: facts 13531 1726882439.60398: Evaluated conditional (ansible_distribution_major_version | int > 9): False 13531 1726882439.60406: when evaluation is False, skipping this task 13531 1726882439.60414: _execute() done 13531 1726882439.60426: dumping result to json 13531 1726882439.60434: done dumping result, returning 13531 1726882439.60446: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-4fd9-519d-00000000007f] 13531 1726882439.60461: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000007f 13531 1726882439.60592: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000007f 13531 1726882439.60598: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 13531 1726882439.60643: no more pending results, returning what we have 13531 1726882439.60647: 
results queue empty 13531 1726882439.60648: checking for any_errors_fatal 13531 1726882439.60659: done checking for any_errors_fatal 13531 1726882439.60659: checking for max_fail_percentage 13531 1726882439.60661: done checking for max_fail_percentage 13531 1726882439.60662: checking to see if all hosts have failed and the running result is not ok 13531 1726882439.60666: done checking to see if all hosts have failed 13531 1726882439.60666: getting the remaining hosts for this loop 13531 1726882439.60668: done getting the remaining hosts for this loop 13531 1726882439.60671: getting the next task for host managed_node2 13531 1726882439.60677: done getting next task for host managed_node2 13531 1726882439.60681: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13531 1726882439.60684: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882439.60701: getting variables 13531 1726882439.60703: in VariableManager get_vars() 13531 1726882439.60755: Calling all_inventory to load vars for managed_node2 13531 1726882439.60758: Calling groups_inventory to load vars for managed_node2 13531 1726882439.60760: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882439.60774: Calling all_plugins_play to load vars for managed_node2 13531 1726882439.60777: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882439.60780: Calling groups_plugins_play to load vars for managed_node2 13531 1726882439.61625: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882439.62920: done with get_vars() 13531 1726882439.62943: done getting variables 13531 1726882439.63007: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:33:59 -0400 (0:00:00.072) 0:00:27.525 ****** 13531 1726882439.63038: entering _queue_task() for managed_node2/dnf 13531 1726882439.63369: worker is 1 (out of 1 available) 13531 1726882439.63381: exiting _queue_task() for managed_node2/dnf 13531 1726882439.63394: done queuing things up, now waiting for results queue to drain 13531 1726882439.63396: waiting for pending results... 
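The next task queued above uses the `dnf` action and, per the log, first evaluates the conditional `ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7` to True before resolving `__network_wireless_connections_defined`. A hypothetical sketch of the gated shape such a check could take — the package list variable, `state`, and `check_mode` settings are placeholders assumed for illustration, not read from the role:

```yaml
# Hypothetical sketch -- the real task at roles/network/tasks/main.yml:36
# likely names concrete packages; everything below the guard is assumed.
- name: Check if updates for network packages are available through the
        DNF package manager due to wireless or team interfaces
  dnf:
    name: "{{ __network_packages }}"  # placeholder variable name
    state: latest
  check_mode: true  # assumed: probe for updates without installing
  when: ansible_distribution == 'Fedora' or
        ansible_distribution_major_version | int > 7
```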
13531 1726882439.63703: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13531 1726882439.63814: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000080 13531 1726882439.63825: variable 'ansible_search_path' from source: unknown 13531 1726882439.63828: variable 'ansible_search_path' from source: unknown 13531 1726882439.63867: calling self._execute() 13531 1726882439.63947: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882439.63951: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882439.63964: variable 'omit' from source: magic vars 13531 1726882439.64234: variable 'ansible_distribution_major_version' from source: facts 13531 1726882439.64244: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882439.64389: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13531 1726882439.66151: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13531 1726882439.66229: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13531 1726882439.66277: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13531 1726882439.66315: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13531 1726882439.66347: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13531 1726882439.66445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882439.66485: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882439.66521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882439.66577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882439.66597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882439.66723: variable 'ansible_distribution' from source: facts 13531 1726882439.66736: variable 'ansible_distribution_major_version' from source: facts 13531 1726882439.66758: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 13531 1726882439.66887: variable '__network_wireless_connections_defined' from source: role '' defaults 13531 1726882439.67030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882439.67068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882439.67097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882439.67142: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882439.67171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882439.67215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882439.67242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882439.67278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882439.67322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882439.67340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882439.67393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882439.67420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 
1726882439.67450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882439.67504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882439.67525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882439.67701: variable 'network_connections' from source: task vars 13531 1726882439.67719: variable 'controller_profile' from source: play vars 13531 1726882439.67789: variable 'controller_profile' from source: play vars 13531 1726882439.67860: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13531 1726882439.67997: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13531 1726882439.68025: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13531 1726882439.68051: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13531 1726882439.68076: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13531 1726882439.68115: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13531 1726882439.68130: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13531 1726882439.68154: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882439.68175: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13531 1726882439.68212: variable '__network_team_connections_defined' from source: role '' defaults 13531 1726882439.68375: variable 'network_connections' from source: task vars 13531 1726882439.68378: variable 'controller_profile' from source: play vars 13531 1726882439.68422: variable 'controller_profile' from source: play vars 13531 1726882439.68440: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13531 1726882439.68443: when evaluation is False, skipping this task 13531 1726882439.68445: _execute() done 13531 1726882439.68448: dumping result to json 13531 1726882439.68450: done dumping result, returning 13531 1726882439.68461: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-4fd9-519d-000000000080] 13531 1726882439.68470: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000080 13531 1726882439.68558: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000080 13531 1726882439.68561: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13531 1726882439.68621: no more pending results, returning what we have 13531 1726882439.68624: results queue empty 13531 1726882439.68625: checking for any_errors_fatal 13531 1726882439.68632: done checking for 
any_errors_fatal 13531 1726882439.68633: checking for max_fail_percentage 13531 1726882439.68635: done checking for max_fail_percentage 13531 1726882439.68636: checking to see if all hosts have failed and the running result is not ok 13531 1726882439.68636: done checking to see if all hosts have failed 13531 1726882439.68637: getting the remaining hosts for this loop 13531 1726882439.68638: done getting the remaining hosts for this loop 13531 1726882439.68642: getting the next task for host managed_node2 13531 1726882439.68649: done getting next task for host managed_node2 13531 1726882439.68653: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13531 1726882439.68655: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882439.68676: getting variables 13531 1726882439.68678: in VariableManager get_vars() 13531 1726882439.68726: Calling all_inventory to load vars for managed_node2 13531 1726882439.68729: Calling groups_inventory to load vars for managed_node2 13531 1726882439.68731: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882439.68741: Calling all_plugins_play to load vars for managed_node2 13531 1726882439.68743: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882439.68746: Calling groups_plugins_play to load vars for managed_node2 13531 1726882439.69576: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882439.71169: done with get_vars() 13531 1726882439.71197: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 13531 1726882439.71282: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:33:59 -0400 (0:00:00.082) 0:00:27.608 ****** 13531 1726882439.71311: entering _queue_task() for managed_node2/yum 13531 1726882439.71642: worker is 1 (out of 1 available) 13531 1726882439.71657: exiting _queue_task() for managed_node2/yum 13531 1726882439.71672: done queuing things up, now waiting for results queue to drain 13531 1726882439.71673: waiting for pending results... 
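The two "Check if updates for network packages..." tasks above are skipped because their `when:` guards evaluate to False, and the log records the failing expression in `false_condition`. A minimal sketch of how such guarded tasks look in a role, with the guards taken from the log's own evaluated conditionals (the package names, `check_mode` use, and exact task bodies in fedora.linux_system_roles.network are assumptions here, not the role's actual source):

```yaml
# Sketch only: guards reconstructed from the log's "false_condition"
# output; package name and check_mode are illustrative assumptions.
- name: Check if updates for network packages are available through the DNF package manager
  ansible.builtin.dnf:
    name: NetworkManager
    state: latest
  check_mode: true
  when:
    - __network_wireless_connections_defined or __network_team_connections_defined

- name: Check if updates for network packages are available through the YUM package manager
  ansible.builtin.yum:
    name: NetworkManager
    state: latest
  check_mode: true
  when:
    - ansible_distribution_major_version | int < 8
```

When every item in `when:` is False, Ansible emits the `skipping:` result with `"skip_reason": "Conditional result was False"`, exactly as in the JSON blocks above.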
13531 1726882439.71983: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13531 1726882439.72104: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000081 13531 1726882439.72127: variable 'ansible_search_path' from source: unknown 13531 1726882439.72137: variable 'ansible_search_path' from source: unknown 13531 1726882439.72180: calling self._execute() 13531 1726882439.72266: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882439.72276: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882439.72284: variable 'omit' from source: magic vars 13531 1726882439.72595: variable 'ansible_distribution_major_version' from source: facts 13531 1726882439.72612: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882439.72783: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13531 1726882439.74700: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13531 1726882439.74742: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13531 1726882439.74775: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13531 1726882439.74800: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13531 1726882439.74821: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13531 1726882439.74882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882439.75137: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882439.75159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882439.75189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882439.75206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882439.75272: variable 'ansible_distribution_major_version' from source: facts 13531 1726882439.75284: Evaluated conditional (ansible_distribution_major_version | int < 8): False 13531 1726882439.75286: when evaluation is False, skipping this task 13531 1726882439.75289: _execute() done 13531 1726882439.75291: dumping result to json 13531 1726882439.75300: done dumping result, returning 13531 1726882439.75307: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-4fd9-519d-000000000081] 13531 1726882439.75314: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000081 13531 1726882439.75402: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000081 13531 1726882439.75405: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 13531 1726882439.75452: no more pending results, returning 
what we have 13531 1726882439.75458: results queue empty 13531 1726882439.75459: checking for any_errors_fatal 13531 1726882439.75469: done checking for any_errors_fatal 13531 1726882439.75470: checking for max_fail_percentage 13531 1726882439.75472: done checking for max_fail_percentage 13531 1726882439.75473: checking to see if all hosts have failed and the running result is not ok 13531 1726882439.75474: done checking to see if all hosts have failed 13531 1726882439.75475: getting the remaining hosts for this loop 13531 1726882439.75476: done getting the remaining hosts for this loop 13531 1726882439.75480: getting the next task for host managed_node2 13531 1726882439.75486: done getting next task for host managed_node2 13531 1726882439.75490: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13531 1726882439.75493: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882439.75510: getting variables 13531 1726882439.75511: in VariableManager get_vars() 13531 1726882439.75564: Calling all_inventory to load vars for managed_node2 13531 1726882439.75567: Calling groups_inventory to load vars for managed_node2 13531 1726882439.75569: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882439.75580: Calling all_plugins_play to load vars for managed_node2 13531 1726882439.75583: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882439.75585: Calling groups_plugins_play to load vars for managed_node2 13531 1726882439.77012: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882439.77981: done with get_vars() 13531 1726882439.77997: done getting variables 13531 1726882439.78037: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:33:59 -0400 (0:00:00.067) 0:00:27.675 ****** 13531 1726882439.78065: entering _queue_task() for managed_node2/fail 13531 1726882439.78280: worker is 1 (out of 1 available) 13531 1726882439.78293: exiting _queue_task() for managed_node2/fail 13531 1726882439.78305: done queuing things up, now waiting for results queue to drain 13531 1726882439.78306: waiting for pending results... 
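Each `Evaluated conditional (...)` line shows Ansible rendering a `when:` expression through Jinja2 (the `jinja version = 3.1.4` engine from the header) and mapping the rendered text back to a boolean. A simplified model of that evaluation, using a hypothetical major version of "9" for illustration (Ansible's real implementation adds sandboxing and its own filters and tests):

```python
# Simplified model of Ansible conditional evaluation: render the
# bare expression as a Jinja2 template, then interpret the result.
from jinja2 import Environment

def evaluate_conditional(expr: str, variables: dict) -> bool:
    """Render "{{ expr }}" with the given variables and map the
    rendered text back to a boolean, as Ansible effectively does."""
    env = Environment()
    rendered = env.from_string("{{ " + expr + " }}").render(**variables)
    return rendered == "True"

# The two evaluations visible for the YUM-check task above,
# assuming a (hypothetical) distribution major version of "9":
print(evaluate_conditional("ansible_distribution_major_version != '6'",
                           {"ansible_distribution_major_version": "9"}))  # True
print(evaluate_conditional("ansible_distribution_major_version | int < 8",
                           {"ansible_distribution_major_version": "9"}))  # False
```

This is why the facts-driven guard `ansible_distribution_major_version != '6'` passes while `ansible_distribution_major_version | int < 8` fails, producing the skip.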
13531 1726882439.78486: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13531 1726882439.78579: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000082 13531 1726882439.78589: variable 'ansible_search_path' from source: unknown 13531 1726882439.78593: variable 'ansible_search_path' from source: unknown 13531 1726882439.78625: calling self._execute() 13531 1726882439.78696: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882439.78700: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882439.78709: variable 'omit' from source: magic vars 13531 1726882439.78988: variable 'ansible_distribution_major_version' from source: facts 13531 1726882439.78998: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882439.79083: variable '__network_wireless_connections_defined' from source: role '' defaults 13531 1726882439.79213: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13531 1726882439.80784: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13531 1726882439.80826: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13531 1726882439.80852: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13531 1726882439.80880: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13531 1726882439.80904: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13531 1726882439.80961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 13531 1726882439.80997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882439.81017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882439.81043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882439.81057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882439.81090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882439.81112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882439.81129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882439.81157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882439.81168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882439.81196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882439.81218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882439.81234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882439.81260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882439.81273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882439.81394: variable 'network_connections' from source: task vars 13531 1726882439.81402: variable 'controller_profile' from source: play vars 13531 1726882439.81449: variable 'controller_profile' from source: play vars 13531 1726882439.81501: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13531 1726882439.81608: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13531 1726882439.81636: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13531 1726882439.81658: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13531 
1726882439.81686: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13531 1726882439.81715: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13531 1726882439.81730: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13531 1726882439.81748: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882439.81769: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13531 1726882439.81806: variable '__network_team_connections_defined' from source: role '' defaults 13531 1726882439.81959: variable 'network_connections' from source: task vars 13531 1726882439.81963: variable 'controller_profile' from source: play vars 13531 1726882439.82007: variable 'controller_profile' from source: play vars 13531 1726882439.82025: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13531 1726882439.82029: when evaluation is False, skipping this task 13531 1726882439.82031: _execute() done 13531 1726882439.82034: dumping result to json 13531 1726882439.82036: done dumping result, returning 13531 1726882439.82042: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-4fd9-519d-000000000082] 13531 1726882439.82047: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000082 13531 1726882439.82142: 
done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000082 13531 1726882439.82144: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13531 1726882439.82200: no more pending results, returning what we have 13531 1726882439.82204: results queue empty 13531 1726882439.82205: checking for any_errors_fatal 13531 1726882439.82212: done checking for any_errors_fatal 13531 1726882439.82213: checking for max_fail_percentage 13531 1726882439.82215: done checking for max_fail_percentage 13531 1726882439.82216: checking to see if all hosts have failed and the running result is not ok 13531 1726882439.82217: done checking to see if all hosts have failed 13531 1726882439.82218: getting the remaining hosts for this loop 13531 1726882439.82219: done getting the remaining hosts for this loop 13531 1726882439.82222: getting the next task for host managed_node2 13531 1726882439.82228: done getting next task for host managed_node2 13531 1726882439.82232: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 13531 1726882439.82234: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882439.82250: getting variables 13531 1726882439.82252: in VariableManager get_vars() 13531 1726882439.82308: Calling all_inventory to load vars for managed_node2 13531 1726882439.82311: Calling groups_inventory to load vars for managed_node2 13531 1726882439.82313: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882439.82322: Calling all_plugins_play to load vars for managed_node2 13531 1726882439.82325: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882439.82327: Calling groups_plugins_play to load vars for managed_node2 13531 1726882439.83132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882439.84073: done with get_vars() 13531 1726882439.84088: done getting variables 13531 1726882439.84127: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:33:59 -0400 (0:00:00.060) 0:00:27.736 ****** 13531 1726882439.84151: entering _queue_task() for managed_node2/package 13531 1726882439.84348: worker is 1 (out of 1 available) 13531 1726882439.84364: exiting _queue_task() for managed_node2/package 13531 1726882439.84377: done queuing things up, now waiting for results queue to drain 13531 1726882439.84378: waiting for pending results... 
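The "Install packages" task resolved here loads the generic `package` action, with the package list coming from the role default `network_packages` (assembled from the `__network_packages_default_*` variables the log resolves next). A sketch of the shape of such a task (the real task in fedora.linux_system_roles.network may carry additional conditions and retries; this is not its verbatim source):

```yaml
# Sketch only: illustrates the package-action pattern the log shows,
# not the role's actual task definition at tasks/main.yml:73.
- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present
```

`ansible.builtin.package` dispatches to the platform's package manager, which is why the earlier tasks in this run show `redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf` on DNF-based systems.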
13531 1726882439.84555: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 13531 1726882439.84636: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000083 13531 1726882439.84646: variable 'ansible_search_path' from source: unknown 13531 1726882439.84649: variable 'ansible_search_path' from source: unknown 13531 1726882439.84682: calling self._execute() 13531 1726882439.84754: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882439.84761: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882439.84771: variable 'omit' from source: magic vars 13531 1726882439.85037: variable 'ansible_distribution_major_version' from source: facts 13531 1726882439.85048: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882439.85186: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13531 1726882439.85375: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13531 1726882439.85405: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13531 1726882439.85430: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13531 1726882439.85489: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13531 1726882439.85567: variable 'network_packages' from source: role '' defaults 13531 1726882439.85637: variable '__network_provider_setup' from source: role '' defaults 13531 1726882439.85646: variable '__network_service_name_default_nm' from source: role '' defaults 13531 1726882439.85695: variable '__network_service_name_default_nm' from source: role '' defaults 13531 1726882439.85703: variable '__network_packages_default_nm' from source: role '' defaults 13531 1726882439.85745: variable 
'__network_packages_default_nm' from source: role '' defaults 13531 1726882439.85868: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13531 1726882439.87441: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13531 1726882439.87492: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13531 1726882439.87519: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13531 1726882439.87543: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13531 1726882439.87567: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13531 1726882439.87621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882439.87640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882439.87661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882439.87689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882439.87700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 
1726882439.87731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882439.87747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882439.87768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882439.87793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882439.87803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882439.87944: variable '__network_packages_default_gobject_packages' from source: role '' defaults 13531 1726882439.88014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882439.88030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882439.88047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882439.88081: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882439.88092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882439.88150: variable 'ansible_python' from source: facts 13531 1726882439.88176: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 13531 1726882439.88231: variable '__network_wpa_supplicant_required' from source: role '' defaults 13531 1726882439.88292: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13531 1726882439.88376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882439.88394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882439.88411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882439.88436: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882439.88447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882439.88485: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882439.88505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882439.88523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882439.88548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882439.88561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882439.88656: variable 'network_connections' from source: task vars 13531 1726882439.88666: variable 'controller_profile' from source: play vars 13531 1726882439.88734: variable 'controller_profile' from source: play vars 13531 1726882439.88785: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13531 1726882439.88806: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13531 1726882439.88826: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882439.88848: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13531 1726882439.88885: variable '__network_wireless_connections_defined' from source: role '' defaults 13531 1726882439.89066: variable 'network_connections' from source: task vars 13531 1726882439.89069: variable 'controller_profile' from source: play vars 13531 1726882439.89136: variable 'controller_profile' from source: play vars 13531 1726882439.89162: variable '__network_packages_default_wireless' from source: role '' defaults 13531 1726882439.89218: variable '__network_wireless_connections_defined' from source: role '' defaults 13531 1726882439.89416: variable 'network_connections' from source: task vars 13531 1726882439.89420: variable 'controller_profile' from source: play vars 13531 1726882439.89468: variable 'controller_profile' from source: play vars 13531 1726882439.89487: variable '__network_packages_default_team' from source: role '' defaults 13531 1726882439.89542: variable '__network_team_connections_defined' from source: role '' defaults 13531 1726882439.89740: variable 'network_connections' from source: task vars 13531 1726882439.89744: variable 'controller_profile' from source: play vars 13531 1726882439.89790: variable 'controller_profile' from source: play vars 13531 1726882439.89830: variable '__network_service_name_default_initscripts' from source: role '' defaults 13531 1726882439.89875: variable '__network_service_name_default_initscripts' from source: role '' defaults 13531 1726882439.89881: variable '__network_packages_default_initscripts' from source: role '' defaults 13531 1726882439.89928: variable '__network_packages_default_initscripts' from source: role '' defaults 13531 1726882439.90068: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 13531 1726882439.90370: variable 'network_connections' from source: task vars 13531 
1726882439.90376: variable 'controller_profile' from source: play vars 13531 1726882439.90416: variable 'controller_profile' from source: play vars 13531 1726882439.90423: variable 'ansible_distribution' from source: facts 13531 1726882439.90425: variable '__network_rh_distros' from source: role '' defaults 13531 1726882439.90432: variable 'ansible_distribution_major_version' from source: facts 13531 1726882439.90445: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 13531 1726882439.90551: variable 'ansible_distribution' from source: facts 13531 1726882439.90558: variable '__network_rh_distros' from source: role '' defaults 13531 1726882439.90561: variable 'ansible_distribution_major_version' from source: facts 13531 1726882439.90575: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 13531 1726882439.90686: variable 'ansible_distribution' from source: facts 13531 1726882439.90689: variable '__network_rh_distros' from source: role '' defaults 13531 1726882439.90693: variable 'ansible_distribution_major_version' from source: facts 13531 1726882439.90718: variable 'network_provider' from source: set_fact 13531 1726882439.90728: variable 'ansible_facts' from source: unknown 13531 1726882439.91101: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 13531 1726882439.91105: when evaluation is False, skipping this task 13531 1726882439.91107: _execute() done 13531 1726882439.91110: dumping result to json 13531 1726882439.91112: done dumping result, returning 13531 1726882439.91118: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-4fd9-519d-000000000083] 13531 1726882439.91123: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000083 13531 1726882439.91216: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000083 13531 1726882439.91218: WORKER PROCESS 
EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 13531 1726882439.91281: no more pending results, returning what we have 13531 1726882439.91285: results queue empty 13531 1726882439.91286: checking for any_errors_fatal 13531 1726882439.91293: done checking for any_errors_fatal 13531 1726882439.91294: checking for max_fail_percentage 13531 1726882439.91296: done checking for max_fail_percentage 13531 1726882439.91296: checking to see if all hosts have failed and the running result is not ok 13531 1726882439.91297: done checking to see if all hosts have failed 13531 1726882439.91298: getting the remaining hosts for this loop 13531 1726882439.91300: done getting the remaining hosts for this loop 13531 1726882439.91303: getting the next task for host managed_node2 13531 1726882439.91314: done getting next task for host managed_node2 13531 1726882439.91318: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 13531 1726882439.91321: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882439.91334: getting variables 13531 1726882439.91336: in VariableManager get_vars() 13531 1726882439.91384: Calling all_inventory to load vars for managed_node2 13531 1726882439.91386: Calling groups_inventory to load vars for managed_node2 13531 1726882439.91389: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882439.91397: Calling all_plugins_play to load vars for managed_node2 13531 1726882439.91400: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882439.91402: Calling groups_plugins_play to load vars for managed_node2 13531 1726882439.92317: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882439.93259: done with get_vars() 13531 1726882439.93276: done getting variables 13531 1726882439.93318: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:33:59 -0400 (0:00:00.091) 0:00:27.828 ****** 13531 1726882439.93339: entering _queue_task() for managed_node2/package 13531 1726882439.93551: worker is 1 (out of 1 available) 13531 1726882439.93568: exiting _queue_task() for managed_node2/package 13531 1726882439.93581: done queuing things up, now waiting for results queue to drain 13531 1726882439.93583: waiting for pending results... 
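The skip recorded above for the Install packages task hinges on the `false_condition` in its result JSON: `not network_packages is subset(ansible_facts.packages.keys())`. A minimal sketch of a task shaped like the one at `roles/network/tasks/main.yml:73` follows; this is a reconstruction from the log, not the role's actual source, and the `package_facts` prerequisite is an assumption:

```yaml
# Hypothetical reconstruction of the skipped task. Requires that
# ansible.builtin.package_facts ran earlier so that
# ansible_facts.packages is populated.
- name: Install packages
  package:
    name: "{{ network_packages }}"
    state: present
  # The `subset` test passes when every entry of network_packages is
  # already a key in the gathered package facts, so the `not ...`
  # guard evaluates False and the task is skipped, as in the log.
  when: not network_packages is subset(ansible_facts.packages.keys())
```

When the `when` expression evaluates False, Ansible emits exactly the `skipping: [managed_node2]` result seen above, with the expression echoed back as `false_condition` and `skip_reason: "Conditional result was False"`.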
13531 1726882439.93762: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 13531 1726882439.93865: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000084 13531 1726882439.93876: variable 'ansible_search_path' from source: unknown 13531 1726882439.93880: variable 'ansible_search_path' from source: unknown 13531 1726882439.93911: calling self._execute() 13531 1726882439.93985: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882439.93989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882439.93996: variable 'omit' from source: magic vars 13531 1726882439.94267: variable 'ansible_distribution_major_version' from source: facts 13531 1726882439.94278: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882439.94361: variable 'network_state' from source: role '' defaults 13531 1726882439.94371: Evaluated conditional (network_state != {}): False 13531 1726882439.94374: when evaluation is False, skipping this task 13531 1726882439.94376: _execute() done 13531 1726882439.94379: dumping result to json 13531 1726882439.94381: done dumping result, returning 13531 1726882439.94389: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-4fd9-519d-000000000084] 13531 1726882439.94394: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000084 13531 1726882439.94488: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000084 13531 1726882439.94491: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13531 1726882439.94534: no more pending results, returning what we have 13531 1726882439.94537: results queue empty 13531 1726882439.94538: checking 
for any_errors_fatal 13531 1726882439.94543: done checking for any_errors_fatal 13531 1726882439.94543: checking for max_fail_percentage 13531 1726882439.94545: done checking for max_fail_percentage 13531 1726882439.94546: checking to see if all hosts have failed and the running result is not ok 13531 1726882439.94547: done checking to see if all hosts have failed 13531 1726882439.94548: getting the remaining hosts for this loop 13531 1726882439.94549: done getting the remaining hosts for this loop 13531 1726882439.94552: getting the next task for host managed_node2 13531 1726882439.94557: done getting next task for host managed_node2 13531 1726882439.94561: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 13531 1726882439.94565: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882439.94587: getting variables 13531 1726882439.94589: in VariableManager get_vars() 13531 1726882439.94630: Calling all_inventory to load vars for managed_node2 13531 1726882439.94633: Calling groups_inventory to load vars for managed_node2 13531 1726882439.94635: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882439.94643: Calling all_plugins_play to load vars for managed_node2 13531 1726882439.94645: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882439.94647: Calling groups_plugins_play to load vars for managed_node2 13531 1726882439.95420: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882439.96447: done with get_vars() 13531 1726882439.96462: done getting variables 13531 1726882439.96504: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:33:59 -0400 (0:00:00.031) 0:00:27.860 ****** 13531 1726882439.96525: entering _queue_task() for managed_node2/package 13531 1726882439.96717: worker is 1 (out of 1 available) 13531 1726882439.96730: exiting _queue_task() for managed_node2/package 13531 1726882439.96743: done queuing things up, now waiting for results queue to drain 13531 1726882439.96744: waiting for pending results... 
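The nmstate task just queued at `main.yml:96`, like its sibling at `main.yml:85` skipped above, is guarded by `network_state != {}`. A sketch under the assumption that the role defaults `network_state` to an empty dict (the package names below are illustrative, not read from the role):

```yaml
# Hypothetical shape of the network_state-guarded install tasks.
- name: Install NetworkManager and nmstate when using network_state variable
  package:
    name:
      - NetworkManager   # assumed package list for illustration
      - nmstate
    state: present
  # With the role default of network_state == {}, this guard is False
  # and both tasks skip, matching the two skip results in the log.
  when: network_state != {}
```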
13531 1726882439.96922: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 13531 1726882439.97014: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000085 13531 1726882439.97025: variable 'ansible_search_path' from source: unknown 13531 1726882439.97028: variable 'ansible_search_path' from source: unknown 13531 1726882439.97059: calling self._execute() 13531 1726882439.97130: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882439.97133: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882439.97141: variable 'omit' from source: magic vars 13531 1726882439.97402: variable 'ansible_distribution_major_version' from source: facts 13531 1726882439.97413: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882439.97498: variable 'network_state' from source: role '' defaults 13531 1726882439.97507: Evaluated conditional (network_state != {}): False 13531 1726882439.97510: when evaluation is False, skipping this task 13531 1726882439.97514: _execute() done 13531 1726882439.97517: dumping result to json 13531 1726882439.97519: done dumping result, returning 13531 1726882439.97525: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-4fd9-519d-000000000085] 13531 1726882439.97537: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000085 13531 1726882439.97623: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000085 13531 1726882439.97625: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13531 1726882439.97681: no more pending results, returning what we have 13531 1726882439.97684: results queue empty 13531 1726882439.97685: checking for 
any_errors_fatal 13531 1726882439.97690: done checking for any_errors_fatal 13531 1726882439.97691: checking for max_fail_percentage 13531 1726882439.97692: done checking for max_fail_percentage 13531 1726882439.97693: checking to see if all hosts have failed and the running result is not ok 13531 1726882439.97694: done checking to see if all hosts have failed 13531 1726882439.97695: getting the remaining hosts for this loop 13531 1726882439.97696: done getting the remaining hosts for this loop 13531 1726882439.97698: getting the next task for host managed_node2 13531 1726882439.97703: done getting next task for host managed_node2 13531 1726882439.97707: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 13531 1726882439.97710: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882439.97725: getting variables 13531 1726882439.97727: in VariableManager get_vars() 13531 1726882439.97773: Calling all_inventory to load vars for managed_node2 13531 1726882439.97775: Calling groups_inventory to load vars for managed_node2 13531 1726882439.97777: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882439.97783: Calling all_plugins_play to load vars for managed_node2 13531 1726882439.97785: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882439.97787: Calling groups_plugins_play to load vars for managed_node2 13531 1726882439.98553: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882439.99504: done with get_vars() 13531 1726882439.99519: done getting variables 13531 1726882439.99560: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:33:59 -0400 (0:00:00.030) 0:00:27.891 ****** 13531 1726882439.99585: entering _queue_task() for managed_node2/service 13531 1726882439.99764: worker is 1 (out of 1 available) 13531 1726882439.99777: exiting _queue_task() for managed_node2/service 13531 1726882439.99790: done queuing things up, now waiting for results queue to drain 13531 1726882439.99791: waiting for pending results... 
13531 1726882439.99965: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 13531 1726882440.00055: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000086 13531 1726882440.00069: variable 'ansible_search_path' from source: unknown 13531 1726882440.00073: variable 'ansible_search_path' from source: unknown 13531 1726882440.00100: calling self._execute() 13531 1726882440.00171: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882440.00174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882440.00182: variable 'omit' from source: magic vars 13531 1726882440.00449: variable 'ansible_distribution_major_version' from source: facts 13531 1726882440.00462: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882440.00541: variable '__network_wireless_connections_defined' from source: role '' defaults 13531 1726882440.00685: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13531 1726882440.02242: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13531 1726882440.02289: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13531 1726882440.02315: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13531 1726882440.02342: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13531 1726882440.02367: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13531 1726882440.02429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 13531 1726882440.02686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882440.02704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882440.02730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882440.02742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882440.02781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882440.02797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882440.02814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882440.02839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882440.02858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882440.02887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882440.02902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882440.02919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882440.02943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882440.02957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882440.03071: variable 'network_connections' from source: task vars 13531 1726882440.03084: variable 'controller_profile' from source: play vars 13531 1726882440.03129: variable 'controller_profile' from source: play vars 13531 1726882440.03182: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13531 1726882440.03287: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13531 1726882440.03390: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13531 1726882440.03427: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13531 
1726882440.03451: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13531 1726882440.03485: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13531 1726882440.03500: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13531 1726882440.03519: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882440.03537: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13531 1726882440.03577: variable '__network_team_connections_defined' from source: role '' defaults 13531 1726882440.03724: variable 'network_connections' from source: task vars 13531 1726882440.03734: variable 'controller_profile' from source: play vars 13531 1726882440.03776: variable 'controller_profile' from source: play vars 13531 1726882440.03794: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13531 1726882440.03797: when evaluation is False, skipping this task 13531 1726882440.03800: _execute() done 13531 1726882440.03802: dumping result to json 13531 1726882440.03805: done dumping result, returning 13531 1726882440.03810: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-4fd9-519d-000000000086] 13531 1726882440.03816: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000086 13531 1726882440.03912: done sending task 
result for task 0e448fcc-3ce9-4fd9-519d-000000000086 13531 1726882440.03920: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13531 1726882440.03973: no more pending results, returning what we have 13531 1726882440.03976: results queue empty 13531 1726882440.03977: checking for any_errors_fatal 13531 1726882440.03983: done checking for any_errors_fatal 13531 1726882440.03984: checking for max_fail_percentage 13531 1726882440.03986: done checking for max_fail_percentage 13531 1726882440.03987: checking to see if all hosts have failed and the running result is not ok 13531 1726882440.03988: done checking to see if all hosts have failed 13531 1726882440.03988: getting the remaining hosts for this loop 13531 1726882440.03990: done getting the remaining hosts for this loop 13531 1726882440.03993: getting the next task for host managed_node2 13531 1726882440.03998: done getting next task for host managed_node2 13531 1726882440.04002: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13531 1726882440.04005: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882440.04022: getting variables 13531 1726882440.04024: in VariableManager get_vars() 13531 1726882440.04084: Calling all_inventory to load vars for managed_node2 13531 1726882440.04087: Calling groups_inventory to load vars for managed_node2 13531 1726882440.04089: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882440.04099: Calling all_plugins_play to load vars for managed_node2 13531 1726882440.04101: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882440.04103: Calling groups_plugins_play to load vars for managed_node2 13531 1726882440.09333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882440.11197: done with get_vars() 13531 1726882440.11221: done getting variables 13531 1726882440.11273: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:34:00 -0400 (0:00:00.117) 0:00:28.008 ****** 13531 1726882440.11309: entering _queue_task() for managed_node2/service 13531 1726882440.12033: worker is 1 (out of 1 available) 13531 1726882440.12045: exiting _queue_task() for managed_node2/service 13531 1726882440.12057: done queuing things up, now waiting for results queue to drain 13531 1726882440.12059: waiting for pending results... 
13531 1726882440.13482: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13531 1726882440.13885: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000087 13531 1726882440.13905: variable 'ansible_search_path' from source: unknown 13531 1726882440.13914: variable 'ansible_search_path' from source: unknown 13531 1726882440.13970: calling self._execute() 13531 1726882440.14139: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882440.14158: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882440.14175: variable 'omit' from source: magic vars 13531 1726882440.14561: variable 'ansible_distribution_major_version' from source: facts 13531 1726882440.14585: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882440.14762: variable 'network_provider' from source: set_fact 13531 1726882440.14776: variable 'network_state' from source: role '' defaults 13531 1726882440.14792: Evaluated conditional (network_provider == "nm" or network_state != {}): True 13531 1726882440.14806: variable 'omit' from source: magic vars 13531 1726882440.14864: variable 'omit' from source: magic vars 13531 1726882440.14898: variable 'network_service_name' from source: role '' defaults 13531 1726882440.14974: variable 'network_service_name' from source: role '' defaults 13531 1726882440.15089: variable '__network_provider_setup' from source: role '' defaults 13531 1726882440.15100: variable '__network_service_name_default_nm' from source: role '' defaults 13531 1726882440.15171: variable '__network_service_name_default_nm' from source: role '' defaults 13531 1726882440.15184: variable '__network_packages_default_nm' from source: role '' defaults 13531 1726882440.15252: variable '__network_packages_default_nm' from source: role '' defaults 13531 1726882440.15493: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 13531 1726882440.17785: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13531 1726882440.17870: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13531 1726882440.17910: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13531 1726882440.17946: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13531 1726882440.17987: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13531 1726882440.18075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882440.18111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882440.18142: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882440.18197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882440.18218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882440.18271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 13531 1726882440.18306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882440.18336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882440.18386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882440.18413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882440.18658: variable '__network_packages_default_gobject_packages' from source: role '' defaults 13531 1726882440.18780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882440.18808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882440.18841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882440.18890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882440.18907: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882440.19004: variable 'ansible_python' from source: facts 13531 1726882440.19031: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 13531 1726882440.19124: variable '__network_wpa_supplicant_required' from source: role '' defaults 13531 1726882440.19212: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13531 1726882440.19345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882440.19381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882440.19411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882440.19478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882440.19490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882440.19525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882440.19545: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882440.19577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882440.19615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882440.19629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882440.19721: variable 'network_connections' from source: task vars 13531 1726882440.19727: variable 'controller_profile' from source: play vars 13531 1726882440.19787: variable 'controller_profile' from source: play vars 13531 1726882440.19859: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13531 1726882440.19991: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13531 1726882440.20024: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13531 1726882440.20059: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13531 1726882440.20089: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13531 1726882440.20130: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13531 1726882440.20152: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13531 1726882440.20179: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882440.20201: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13531 1726882440.20236: variable '__network_wireless_connections_defined' from source: role '' defaults 13531 1726882440.20415: variable 'network_connections' from source: task vars 13531 1726882440.20421: variable 'controller_profile' from source: play vars 13531 1726882440.20476: variable 'controller_profile' from source: play vars 13531 1726882440.20501: variable '__network_packages_default_wireless' from source: role '' defaults 13531 1726882440.20556: variable '__network_wireless_connections_defined' from source: role '' defaults 13531 1726882440.20743: variable 'network_connections' from source: task vars 13531 1726882440.20746: variable 'controller_profile' from source: play vars 13531 1726882440.20800: variable 'controller_profile' from source: play vars 13531 1726882440.20813: variable '__network_packages_default_team' from source: role '' defaults 13531 1726882440.20868: variable '__network_team_connections_defined' from source: role '' defaults 13531 1726882440.21050: variable 'network_connections' from source: task vars 13531 1726882440.21056: variable 'controller_profile' from source: play vars 13531 1726882440.21104: variable 'controller_profile' from source: play vars 13531 1726882440.21170: variable '__network_service_name_default_initscripts' from source: role '' defaults 13531 1726882440.21201: variable '__network_service_name_default_initscripts' from 
source: role '' defaults 13531 1726882440.21208: variable '__network_packages_default_initscripts' from source: role '' defaults 13531 1726882440.21277: variable '__network_packages_default_initscripts' from source: role '' defaults 13531 1726882440.21523: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 13531 1726882440.21999: variable 'network_connections' from source: task vars 13531 1726882440.22003: variable 'controller_profile' from source: play vars 13531 1726882440.22061: variable 'controller_profile' from source: play vars 13531 1726882440.22072: variable 'ansible_distribution' from source: facts 13531 1726882440.22075: variable '__network_rh_distros' from source: role '' defaults 13531 1726882440.22081: variable 'ansible_distribution_major_version' from source: facts 13531 1726882440.22095: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 13531 1726882440.22373: variable 'ansible_distribution' from source: facts 13531 1726882440.22376: variable '__network_rh_distros' from source: role '' defaults 13531 1726882440.22382: variable 'ansible_distribution_major_version' from source: facts 13531 1726882440.22395: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 13531 1726882440.22555: variable 'ansible_distribution' from source: facts 13531 1726882440.22587: variable '__network_rh_distros' from source: role '' defaults 13531 1726882440.22592: variable 'ansible_distribution_major_version' from source: facts 13531 1726882440.22609: variable 'network_provider' from source: set_fact 13531 1726882440.22631: variable 'omit' from source: magic vars 13531 1726882440.22652: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882440.22678: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882440.22697: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882440.22709: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882440.22717: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882440.22738: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882440.22750: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882440.22755: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882440.22822: Set connection var ansible_pipelining to False 13531 1726882440.22827: Set connection var ansible_timeout to 10 13531 1726882440.22832: Set connection var ansible_shell_executable to /bin/sh 13531 1726882440.22837: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882440.22840: Set connection var ansible_connection to ssh 13531 1726882440.22842: Set connection var ansible_shell_type to sh 13531 1726882440.22865: variable 'ansible_shell_executable' from source: unknown 13531 1726882440.22868: variable 'ansible_connection' from source: unknown 13531 1726882440.22871: variable 'ansible_module_compression' from source: unknown 13531 1726882440.22873: variable 'ansible_shell_type' from source: unknown 13531 1726882440.22876: variable 'ansible_shell_executable' from source: unknown 13531 1726882440.22878: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882440.22880: variable 'ansible_pipelining' from source: unknown 13531 1726882440.22882: variable 'ansible_timeout' from source: unknown 13531 1726882440.22887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882440.22957: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882440.22967: variable 'omit' from source: magic vars 13531 1726882440.22973: starting attempt loop 13531 1726882440.22975: running the handler 13531 1726882440.23032: variable 'ansible_facts' from source: unknown 13531 1726882440.23511: _low_level_execute_command(): starting 13531 1726882440.23515: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13531 1726882440.23995: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882440.24013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882440.24037: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882440.24049: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882440.24101: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882440.24116: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882440.24237: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 13531 1726882440.25898: stdout chunk (state=3): >>>/root <<< 13531 1726882440.26000: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882440.26043: stderr chunk (state=3): >>><<< 13531 1726882440.26047: stdout chunk (state=3): >>><<< 13531 1726882440.26068: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882440.26082: _low_level_execute_command(): starting 13531 1726882440.26085: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882440.2606819-14771-90377820002549 `" && echo ansible-tmp-1726882440.2606819-14771-90377820002549="` echo /root/.ansible/tmp/ansible-tmp-1726882440.2606819-14771-90377820002549 `" ) && sleep 0' 13531 
1726882440.26527: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882440.26537: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882440.26542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882440.26554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882440.26587: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882440.26594: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882440.26603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882440.26613: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882440.26620: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882440.26626: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882440.26635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882440.26641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882440.26648: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 13531 1726882440.26655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882440.26709: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882440.26728: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882440.26835: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 13531 1726882440.28707: stdout chunk (state=3): >>>ansible-tmp-1726882440.2606819-14771-90377820002549=/root/.ansible/tmp/ansible-tmp-1726882440.2606819-14771-90377820002549 <<< 13531 1726882440.29365: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882440.29369: stderr chunk (state=3): >>><<< 13531 1726882440.29371: stdout chunk (state=3): >>><<< 13531 1726882440.29374: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882440.2606819-14771-90377820002549=/root/.ansible/tmp/ansible-tmp-1726882440.2606819-14771-90377820002549 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882440.29377: variable 'ansible_module_compression' from source: unknown 13531 1726882440.29379: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13531i_snf1g8/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 13531 
1726882440.29383: variable 'ansible_facts' from source: unknown 13531 1726882440.29386: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882440.2606819-14771-90377820002549/AnsiballZ_systemd.py 13531 1726882440.29689: Sending initial data 13531 1726882440.29692: Sent initial data (155 bytes) 13531 1726882440.30282: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882440.30530: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882440.30538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882440.30541: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882440.30543: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882440.30545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882440.30547: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882440.30549: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882440.30552: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882440.30628: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882440.32367: stderr chunk 
(state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13531 1726882440.32468: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13531 1726882440.32569: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13531i_snf1g8/tmp2e4m4xou /root/.ansible/tmp/ansible-tmp-1726882440.2606819-14771-90377820002549/AnsiballZ_systemd.py <<< 13531 1726882440.32668: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13531 1726882440.35479: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882440.35581: stderr chunk (state=3): >>><<< 13531 1726882440.35585: stdout chunk (state=3): >>><<< 13531 1726882440.35604: done transferring module to remote 13531 1726882440.35614: _low_level_execute_command(): starting 13531 1726882440.35618: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882440.2606819-14771-90377820002549/ /root/.ansible/tmp/ansible-tmp-1726882440.2606819-14771-90377820002549/AnsiballZ_systemd.py && sleep 0' 13531 1726882440.36036: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882440.36043: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882440.36077: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882440.36083: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882440.36092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882440.36097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882440.36104: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882440.36110: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882440.36115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882440.36186: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882440.36190: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882440.36194: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882440.36290: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882440.38078: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882440.38284: stderr chunk (state=3): >>><<< 13531 1726882440.38287: stdout chunk (state=3): >>><<< 13531 1726882440.38362: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882440.38368: _low_level_execute_command(): starting 13531 1726882440.38371: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882440.2606819-14771-90377820002549/AnsiballZ_systemd.py && sleep 0' 13531 1726882440.38897: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882440.38911: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882440.38924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882440.38940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882440.38983: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882440.38995: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882440.39008: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882440.39024: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882440.39035: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882440.39044: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882440.39055: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882440.39079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882440.39096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882440.39107: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882440.39117: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882440.39130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882440.39206: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882440.39227: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882440.39241: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882440.39375: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882440.64342: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", 
"RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "8953856", "MemoryAvailable": "infinity", "CPUUsageNSec": "888369000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", 
"Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft":<<< 13531 1726882440.64362: stdout chunk (state=3): >>> "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", 
"TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", 
"Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", 
"FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 13531 1726882440.65848: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 13531 1726882440.65907: stderr chunk (state=3): >>><<< 13531 1726882440.65910: stdout chunk (state=3): >>><<< 13531 1726882440.65924: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ 
path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "8953856", "MemoryAvailable": "infinity", "CPUUsageNSec": "888369000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", 
"ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", 
"PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 
21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 13531 1726882440.66035: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882440.2606819-14771-90377820002549/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13531 1726882440.66049: _low_level_execute_command(): starting 13531 1726882440.66056: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882440.2606819-14771-90377820002549/ > /dev/null 2>&1 && sleep 0' 13531 1726882440.66519: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882440.66526: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882440.66548: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 13531 1726882440.66562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882440.66608: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882440.66620: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882440.66633: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882440.66775: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882440.68536: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882440.68584: stderr chunk (state=3): >>><<< 13531 1726882440.68587: stdout chunk (state=3): >>><<< 13531 1726882440.68600: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882440.68607: handler run complete 13531 1726882440.68644: attempt loop complete, returning result 13531 1726882440.68647: _execute() done 13531 1726882440.68649: dumping result to json 13531 1726882440.68662: done dumping result, returning 13531 1726882440.68673: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-4fd9-519d-000000000087] 13531 1726882440.68679: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000087 13531 1726882440.68905: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000087 13531 1726882440.68908: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13531 1726882440.68965: no more pending results, returning what we have 13531 1726882440.68969: results queue empty 13531 1726882440.68969: checking for any_errors_fatal 13531 1726882440.68976: done checking for any_errors_fatal 13531 1726882440.68976: checking for max_fail_percentage 13531 1726882440.68978: done checking for max_fail_percentage 13531 1726882440.68979: checking to see if all hosts 
have failed and the running result is not ok 13531 1726882440.68979: done checking to see if all hosts have failed 13531 1726882440.68980: getting the remaining hosts for this loop 13531 1726882440.68981: done getting the remaining hosts for this loop 13531 1726882440.68984: getting the next task for host managed_node2 13531 1726882440.68990: done getting next task for host managed_node2 13531 1726882440.68993: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13531 1726882440.68996: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882440.69006: getting variables 13531 1726882440.69007: in VariableManager get_vars() 13531 1726882440.69052: Calling all_inventory to load vars for managed_node2 13531 1726882440.69057: Calling groups_inventory to load vars for managed_node2 13531 1726882440.69060: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882440.69073: Calling all_plugins_play to load vars for managed_node2 13531 1726882440.69075: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882440.69078: Calling groups_plugins_play to load vars for managed_node2 13531 1726882440.70276: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882440.71767: done with get_vars() 13531 1726882440.71786: done getting variables 13531 1726882440.71828: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:34:00 -0400 (0:00:00.605) 0:00:28.613 ****** 13531 1726882440.71855: entering _queue_task() for managed_node2/service 13531 1726882440.72082: worker is 1 (out of 1 available) 13531 1726882440.72097: exiting _queue_task() for managed_node2/service 13531 1726882440.72111: done queuing things up, now waiting for results queue to drain 13531 1726882440.72112: waiting for pending results... 
13531 1726882440.72308: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13531 1726882440.72399: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000088 13531 1726882440.72413: variable 'ansible_search_path' from source: unknown 13531 1726882440.72416: variable 'ansible_search_path' from source: unknown 13531 1726882440.72449: calling self._execute() 13531 1726882440.72531: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882440.72535: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882440.72543: variable 'omit' from source: magic vars 13531 1726882440.72831: variable 'ansible_distribution_major_version' from source: facts 13531 1726882440.72841: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882440.72921: variable 'network_provider' from source: set_fact 13531 1726882440.72927: Evaluated conditional (network_provider == "nm"): True 13531 1726882440.72992: variable '__network_wpa_supplicant_required' from source: role '' defaults 13531 1726882440.73057: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13531 1726882440.73190: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13531 1726882440.75530: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13531 1726882440.75601: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13531 1726882440.75636: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13531 1726882440.75670: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13531 1726882440.75701: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13531 1726882440.75781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882440.75813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882440.75847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882440.75886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882440.75903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882440.75951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882440.75976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882440.76002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882440.76047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882440.76060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882440.76100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882440.76124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882440.76169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882440.76207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882440.76218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882440.76336: variable 'network_connections' from source: task vars 13531 1726882440.76346: variable 'controller_profile' from source: play vars 13531 1726882440.76400: variable 'controller_profile' from source: play vars 13531 1726882440.76471: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13531 1726882440.76577: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13531 1726882440.76605: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13531 1726882440.76627: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13531 1726882440.76647: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13531 1726882440.76685: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13531 1726882440.76702: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13531 1726882440.76719: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882440.76736: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13531 1726882440.76776: variable '__network_wireless_connections_defined' from source: role '' defaults 13531 1726882440.76932: variable 'network_connections' from source: task vars 13531 1726882440.76936: variable 'controller_profile' from source: play vars 13531 1726882440.76983: variable 'controller_profile' from source: play vars 13531 1726882440.77007: Evaluated conditional (__network_wpa_supplicant_required): False 13531 1726882440.77012: when evaluation is False, skipping this task 13531 1726882440.77015: _execute() done 13531 1726882440.77017: dumping result to json 13531 1726882440.77019: done dumping result, returning 13531 1726882440.77022: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 
[0e448fcc-3ce9-4fd9-519d-000000000088] 13531 1726882440.77031: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000088 13531 1726882440.77117: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000088 13531 1726882440.77120: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 13531 1726882440.77185: no more pending results, returning what we have 13531 1726882440.77189: results queue empty 13531 1726882440.77190: checking for any_errors_fatal 13531 1726882440.77211: done checking for any_errors_fatal 13531 1726882440.77212: checking for max_fail_percentage 13531 1726882440.77213: done checking for max_fail_percentage 13531 1726882440.77214: checking to see if all hosts have failed and the running result is not ok 13531 1726882440.77215: done checking to see if all hosts have failed 13531 1726882440.77216: getting the remaining hosts for this loop 13531 1726882440.77217: done getting the remaining hosts for this loop 13531 1726882440.77220: getting the next task for host managed_node2 13531 1726882440.77226: done getting next task for host managed_node2 13531 1726882440.77232: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 13531 1726882440.77235: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882440.77251: getting variables 13531 1726882440.77253: in VariableManager get_vars() 13531 1726882440.77301: Calling all_inventory to load vars for managed_node2 13531 1726882440.77304: Calling groups_inventory to load vars for managed_node2 13531 1726882440.77306: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882440.77316: Calling all_plugins_play to load vars for managed_node2 13531 1726882440.77318: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882440.77321: Calling groups_plugins_play to load vars for managed_node2 13531 1726882440.78485: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882440.79807: done with get_vars() 13531 1726882440.79826: done getting variables 13531 1726882440.79872: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:34:00 -0400 (0:00:00.080) 0:00:28.694 ****** 13531 1726882440.79898: entering _queue_task() for managed_node2/service 13531 1726882440.80129: worker is 1 (out of 1 available) 13531 1726882440.80143: exiting _queue_task() for managed_node2/service 13531 1726882440.80156: done queuing things up, now waiting for results queue to drain 13531 1726882440.80158: waiting for pending results... 
13531 1726882440.80353: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 13531 1726882440.80447: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000089 13531 1726882440.80461: variable 'ansible_search_path' from source: unknown 13531 1726882440.80466: variable 'ansible_search_path' from source: unknown 13531 1726882440.80499: calling self._execute() 13531 1726882440.80579: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882440.80586: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882440.80593: variable 'omit' from source: magic vars 13531 1726882440.80878: variable 'ansible_distribution_major_version' from source: facts 13531 1726882440.80889: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882440.80969: variable 'network_provider' from source: set_fact 13531 1726882440.80978: Evaluated conditional (network_provider == "initscripts"): False 13531 1726882440.80980: when evaluation is False, skipping this task 13531 1726882440.80983: _execute() done 13531 1726882440.80985: dumping result to json 13531 1726882440.80988: done dumping result, returning 13531 1726882440.80993: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-4fd9-519d-000000000089] 13531 1726882440.80999: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000089 13531 1726882440.81087: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000089 13531 1726882440.81091: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13531 1726882440.81136: no more pending results, returning what we have 13531 1726882440.81139: results queue empty 13531 1726882440.81140: checking for any_errors_fatal 13531 1726882440.81147: done checking for 
any_errors_fatal 13531 1726882440.81148: checking for max_fail_percentage 13531 1726882440.81150: done checking for max_fail_percentage 13531 1726882440.81151: checking to see if all hosts have failed and the running result is not ok 13531 1726882440.81152: done checking to see if all hosts have failed 13531 1726882440.81152: getting the remaining hosts for this loop 13531 1726882440.81154: done getting the remaining hosts for this loop 13531 1726882440.81157: getting the next task for host managed_node2 13531 1726882440.81162: done getting next task for host managed_node2 13531 1726882440.81168: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13531 1726882440.81171: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882440.81189: getting variables 13531 1726882440.81191: in VariableManager get_vars() 13531 1726882440.81243: Calling all_inventory to load vars for managed_node2 13531 1726882440.81246: Calling groups_inventory to load vars for managed_node2 13531 1726882440.81248: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882440.81257: Calling all_plugins_play to load vars for managed_node2 13531 1726882440.81259: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882440.81262: Calling groups_plugins_play to load vars for managed_node2 13531 1726882440.82282: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882440.83330: done with get_vars() 13531 1726882440.83345: done getting variables 13531 1726882440.83390: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:34:00 -0400 (0:00:00.035) 0:00:28.729 ****** 13531 1726882440.83416: entering _queue_task() for managed_node2/copy 13531 1726882440.83721: worker is 1 (out of 1 available) 13531 1726882440.83732: exiting _queue_task() for managed_node2/copy 13531 1726882440.83743: done queuing things up, now waiting for results queue to drain 13531 1726882440.83745: waiting for pending results... 
13531 1726882440.84054: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13531 1726882440.84176: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000008a 13531 1726882440.84192: variable 'ansible_search_path' from source: unknown 13531 1726882440.84195: variable 'ansible_search_path' from source: unknown 13531 1726882440.84229: calling self._execute() 13531 1726882440.84328: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882440.84332: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882440.84341: variable 'omit' from source: magic vars 13531 1726882440.84716: variable 'ansible_distribution_major_version' from source: facts 13531 1726882440.84729: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882440.84850: variable 'network_provider' from source: set_fact 13531 1726882440.84856: Evaluated conditional (network_provider == "initscripts"): False 13531 1726882440.84862: when evaluation is False, skipping this task 13531 1726882440.84866: _execute() done 13531 1726882440.84869: dumping result to json 13531 1726882440.84872: done dumping result, returning 13531 1726882440.84882: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-4fd9-519d-00000000008a] 13531 1726882440.84887: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000008a skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 13531 1726882440.85029: no more pending results, returning what we have 13531 1726882440.85033: results queue empty 13531 1726882440.85034: checking for any_errors_fatal 13531 1726882440.85042: done checking for any_errors_fatal 13531 1726882440.85043: checking for max_fail_percentage 13531 
1726882440.85044: done checking for max_fail_percentage 13531 1726882440.85045: checking to see if all hosts have failed and the running result is not ok 13531 1726882440.85046: done checking to see if all hosts have failed 13531 1726882440.85047: getting the remaining hosts for this loop 13531 1726882440.85048: done getting the remaining hosts for this loop 13531 1726882440.85051: getting the next task for host managed_node2 13531 1726882440.85061: done getting next task for host managed_node2 13531 1726882440.85067: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13531 1726882440.85071: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882440.85082: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000008a 13531 1726882440.85086: WORKER PROCESS EXITING 13531 1726882440.85099: getting variables 13531 1726882440.85102: in VariableManager get_vars() 13531 1726882440.85159: Calling all_inventory to load vars for managed_node2 13531 1726882440.85162: Calling groups_inventory to load vars for managed_node2 13531 1726882440.85166: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882440.85180: Calling all_plugins_play to load vars for managed_node2 13531 1726882440.85183: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882440.85187: Calling groups_plugins_play to load vars for managed_node2 13531 1726882440.86803: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882440.88158: done with get_vars() 13531 1726882440.88175: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:34:00 -0400 (0:00:00.048) 0:00:28.777 ****** 13531 1726882440.88233: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 13531 1726882440.88448: worker is 1 (out of 1 available) 13531 1726882440.88465: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 13531 1726882440.88477: done queuing things up, now waiting for results queue to drain 13531 1726882440.88479: waiting for pending results... 
13531 1726882440.88667: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13531 1726882440.88757: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000008b 13531 1726882440.88767: variable 'ansible_search_path' from source: unknown 13531 1726882440.88771: variable 'ansible_search_path' from source: unknown 13531 1726882440.88802: calling self._execute() 13531 1726882440.88877: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882440.88881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882440.88889: variable 'omit' from source: magic vars 13531 1726882440.89166: variable 'ansible_distribution_major_version' from source: facts 13531 1726882440.89177: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882440.89183: variable 'omit' from source: magic vars 13531 1726882440.89219: variable 'omit' from source: magic vars 13531 1726882440.89332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13531 1726882440.92044: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13531 1726882440.92128: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13531 1726882440.92172: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13531 1726882440.92218: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13531 1726882440.92248: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13531 1726882440.92341: variable 'network_provider' from source: set_fact 13531 1726882440.92489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882440.92530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882440.92565: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882440.92611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882440.92639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882440.92722: variable 'omit' from source: magic vars 13531 1726882440.92861: variable 'omit' from source: magic vars 13531 1726882440.92979: variable 'network_connections' from source: task vars 13531 1726882440.92993: variable 'controller_profile' from source: play vars 13531 1726882440.93072: variable 'controller_profile' from source: play vars 13531 1726882440.93227: variable 'omit' from source: magic vars 13531 1726882440.93239: variable '__lsr_ansible_managed' from source: task vars 13531 1726882440.93314: variable '__lsr_ansible_managed' from source: task vars 13531 1726882440.93537: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 13531 1726882440.93781: Loaded config def from plugin (lookup/template) 13531 1726882440.93790: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 13531 1726882440.93819: File lookup term: get_ansible_managed.j2 13531 1726882440.93830: 
variable 'ansible_search_path' from source: unknown 13531 1726882440.93843: evaluation_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 13531 1726882440.93865: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 13531 1726882440.93888: variable 'ansible_search_path' from source: unknown 13531 1726882441.00941: variable 'ansible_managed' from source: unknown 13531 1726882441.01111: variable 'omit' from source: magic vars 13531 1726882441.01158: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882441.01191: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882441.01214: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882441.01248: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882441.01265: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882441.01298: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882441.01305: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882441.01313: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882441.01424: Set connection var ansible_pipelining to False 13531 1726882441.01435: Set connection var ansible_timeout to 10 13531 1726882441.01443: Set connection var ansible_shell_executable to /bin/sh 13531 1726882441.01460: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882441.01471: Set connection var ansible_connection to ssh 13531 1726882441.01478: Set connection var ansible_shell_type to sh 13531 1726882441.01508: variable 'ansible_shell_executable' from source: unknown 13531 1726882441.01515: variable 'ansible_connection' from source: unknown 13531 1726882441.01521: variable 'ansible_module_compression' from source: unknown 13531 1726882441.01526: variable 'ansible_shell_type' from source: unknown 13531 1726882441.01531: variable 'ansible_shell_executable' from source: unknown 13531 1726882441.01537: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882441.01543: variable 'ansible_pipelining' from source: unknown 13531 1726882441.01548: variable 'ansible_timeout' from source: unknown 13531 1726882441.01562: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882441.01695: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13531 1726882441.01718: variable 'omit' from source: magic vars 13531 1726882441.01728: starting attempt loop 13531 1726882441.01736: running the handler 13531 1726882441.01759: _low_level_execute_command(): starting 13531 1726882441.01777: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13531 1726882441.02570: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882441.02585: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882441.02599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882441.02616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882441.02667: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882441.02683: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882441.02696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882441.02714: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882441.02728: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882441.02740: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882441.02751: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882441.02771: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882441.02789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882441.02802: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.11.158 originally 10.31.11.158 <<< 13531 1726882441.02812: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882441.02824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882441.02910: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882441.02926: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882441.02939: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882441.03122: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882441.04791: stdout chunk (state=3): >>>/root <<< 13531 1726882441.04991: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882441.04994: stdout chunk (state=3): >>><<< 13531 1726882441.04996: stderr chunk (state=3): >>><<< 13531 1726882441.05109: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882441.05113: _low_level_execute_command(): starting 13531 1726882441.05116: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882441.0501726-14807-236848477353747 `" && echo ansible-tmp-1726882441.0501726-14807-236848477353747="` echo /root/.ansible/tmp/ansible-tmp-1726882441.0501726-14807-236848477353747 `" ) && sleep 0' 13531 1726882441.05732: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882441.05747: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882441.05768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882441.05792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882441.05835: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882441.05848: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882441.05869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882441.05890: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882441.05906: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882441.05918: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882441.05932: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882441.05946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882441.05968: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882441.05980: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882441.05991: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882441.06009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882441.06087: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882441.06104: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882441.06121: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882441.06275: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882441.08171: stdout chunk (state=3): >>>ansible-tmp-1726882441.0501726-14807-236848477353747=/root/.ansible/tmp/ansible-tmp-1726882441.0501726-14807-236848477353747 <<< 13531 1726882441.08345: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882441.08349: stdout chunk (state=3): >>><<< 13531 1726882441.08358: stderr chunk (state=3): >>><<< 13531 1726882441.08376: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882441.0501726-14807-236848477353747=/root/.ansible/tmp/ansible-tmp-1726882441.0501726-14807-236848477353747 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882441.08422: variable 'ansible_module_compression' from source: unknown 13531 1726882441.08467: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13531i_snf1g8/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 13531 1726882441.08513: variable 'ansible_facts' from source: unknown 13531 1726882441.08616: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882441.0501726-14807-236848477353747/AnsiballZ_network_connections.py 13531 1726882441.08759: Sending initial data 13531 1726882441.08763: Sent initial data (168 bytes) 13531 1726882441.09656: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882441.09662: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882441.09677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882441.09692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882441.09728: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882441.09735: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882441.09745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882441.09759: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass <<< 13531 1726882441.09766: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882441.09781: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882441.09788: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882441.09797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882441.09809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882441.09815: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882441.09821: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882441.09830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882441.09897: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882441.09913: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882441.09922: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882441.10057: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882441.11837: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13531 
1726882441.11933: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13531 1726882441.12033: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13531i_snf1g8/tmpw6gd816k /root/.ansible/tmp/ansible-tmp-1726882441.0501726-14807-236848477353747/AnsiballZ_network_connections.py <<< 13531 1726882441.12131: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13531 1726882441.13892: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882441.14231: stderr chunk (state=3): >>><<< 13531 1726882441.14234: stdout chunk (state=3): >>><<< 13531 1726882441.14235: done transferring module to remote 13531 1726882441.14237: _low_level_execute_command(): starting 13531 1726882441.14239: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882441.0501726-14807-236848477353747/ /root/.ansible/tmp/ansible-tmp-1726882441.0501726-14807-236848477353747/AnsiballZ_network_connections.py && sleep 0' 13531 1726882441.15094: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882441.15103: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882441.15114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882441.15128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882441.15180: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882441.15187: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882441.15197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882441.15211: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass <<< 13531 1726882441.15218: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882441.15225: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882441.15232: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882441.15241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882441.15268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882441.15275: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882441.15282: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882441.15292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882441.15377: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882441.15396: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882441.15408: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882441.15536: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882441.17451: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882441.17454: stdout chunk (state=3): >>><<< 13531 1726882441.17457: stderr chunk (state=3): >>><<< 13531 1726882441.17470: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882441.17477: _low_level_execute_command(): starting 13531 1726882441.17479: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882441.0501726-14807-236848477353747/AnsiballZ_network_connections.py && sleep 0' 13531 1726882441.18091: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882441.18096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882441.18139: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 13531 1726882441.18144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882441.18155: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882441.18166: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 13531 1726882441.18170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 13531 1726882441.18183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882441.18252: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882441.18260: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882441.18275: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882441.18404: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882441.57203: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload__ntwiklm/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload__ntwiklm/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/ed834b6c-cbda-46ef-ae08-dad7f6819810: error=unknown <<< 13531 1726882441.58016: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": 
false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 13531 1726882441.60476: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 13531 1726882441.60480: stdout chunk (state=3): >>><<< 13531 1726882441.60483: stderr chunk (state=3): >>><<< 13531 1726882441.60570: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload__ntwiklm/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload__ntwiklm/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/ed834b6c-cbda-46ef-ae08-dad7f6819810: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, 
"__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
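The records above trace Ansible's low-level execution pipeline for one module run: create a per-task remote tmp dir under a restrictive umask, transfer the AnsiballZ payload over SFTP, `chmod u+x` the dir and payload, execute it with the target's Python, then remove the tmp dir. A minimal local sketch of that sequence, replayed with `subprocess` instead of SSH (the paths and the `AnsiballZ_demo.py` stand-in module are illustrative, not what Ansible actually ships):

```python
# Hedged sketch of the pipeline visible in the log; Ansible wraps each step in
# `/bin/sh -c '... && sleep 0'` over an SSH multiplexed channel, replayed here
# locally. Illustrative only -- not Ansible's actual executor code.
import os
import shlex
import subprocess
import tempfile

base = tempfile.mkdtemp()
tmp = os.path.join(base, "ansible-tmp-demo")

# 1. Create the per-task tmp dir with a restrictive umask (mode 700).
subprocess.run(
    ["/bin/sh", "-c", f"( umask 77 && mkdir -p {shlex.quote(tmp)} )"],
    check=True,
)

# 2. Transfer the module payload (stand-in for the `sftp> put` in the log).
module = os.path.join(tmp, "AnsiballZ_demo.py")
with open(module, "w") as f:
    f.write('print("module ran")\n')

# 3. Mark the dir and payload executable by the owner.
subprocess.run(
    ["/bin/sh", "-c", f"chmod u+x {shlex.quote(tmp)} {shlex.quote(module)}"],
    check=True,
)

# 4. Execute the payload with the target's Python interpreter.
out = subprocess.run(
    ["python3", module], capture_output=True, text=True, check=True
).stdout

# 5. Clean up the per-task tmp dir, as the final `rm -f -r ...` record does.
subprocess.run(["/bin/sh", "-c", f"rm -rf {shlex.quote(tmp)}"], check=True)
```

Note that even though the module raised `LsrNetworkNmError` internally, the process still exited 0 and emitted a JSON result with `"changed": true`, which is why the task reports `changed` rather than `failed`.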
13531 1726882441.60574: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882441.0501726-14807-236848477353747/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13531 1726882441.60576: _low_level_execute_command(): starting 13531 1726882441.60579: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882441.0501726-14807-236848477353747/ > /dev/null 2>&1 && sleep 0' 13531 1726882441.61195: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882441.61208: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882441.61221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882441.61237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882441.61281: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882441.61297: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882441.61309: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882441.61325: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882441.61336: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882441.61346: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882441.61356: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882441.61373: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882441.61388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882441.61398: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882441.61408: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882441.61421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882441.61499: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882441.61515: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882441.61528: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882441.61660: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882441.63571: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882441.63574: stdout chunk (state=3): >>><<< 13531 1726882441.63576: stderr chunk (state=3): >>><<< 13531 1726882441.63771: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882441.63775: handler run complete 13531 1726882441.63777: attempt loop complete, returning result 13531 1726882441.63779: _execute() done 13531 1726882441.63781: dumping result to json 13531 1726882441.63783: done dumping result, returning 13531 1726882441.63785: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-4fd9-519d-00000000008b] 13531 1726882441.63787: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000008b 13531 1726882441.63869: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000008b 13531 1726882441.63872: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 13531 1726882441.63976: no more pending results, returning what we have 13531 1726882441.63979: results 
queue empty 13531 1726882441.63980: checking for any_errors_fatal 13531 1726882441.63987: done checking for any_errors_fatal 13531 1726882441.63988: checking for max_fail_percentage 13531 1726882441.63990: done checking for max_fail_percentage 13531 1726882441.63991: checking to see if all hosts have failed and the running result is not ok 13531 1726882441.63991: done checking to see if all hosts have failed 13531 1726882441.63992: getting the remaining hosts for this loop 13531 1726882441.63994: done getting the remaining hosts for this loop 13531 1726882441.63997: getting the next task for host managed_node2 13531 1726882441.64003: done getting next task for host managed_node2 13531 1726882441.64007: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 13531 1726882441.64010: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882441.64020: getting variables 13531 1726882441.64022: in VariableManager get_vars() 13531 1726882441.64079: Calling all_inventory to load vars for managed_node2 13531 1726882441.64082: Calling groups_inventory to load vars for managed_node2 13531 1726882441.64085: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882441.64096: Calling all_plugins_play to load vars for managed_node2 13531 1726882441.64098: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882441.64101: Calling groups_plugins_play to load vars for managed_node2 13531 1726882441.65972: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882441.67696: done with get_vars() 13531 1726882441.67728: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:34:01 -0400 (0:00:00.796) 0:00:29.573 ****** 13531 1726882441.67840: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 13531 1726882441.68201: worker is 1 (out of 1 available) 13531 1726882441.68214: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 13531 1726882441.68227: done queuing things up, now waiting for results queue to drain 13531 1726882441.68228: waiting for pending results... 
13531 1726882441.68537: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 13531 1726882441.68694: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000008c 13531 1726882441.68715: variable 'ansible_search_path' from source: unknown 13531 1726882441.68723: variable 'ansible_search_path' from source: unknown 13531 1726882441.68768: calling self._execute() 13531 1726882441.68876: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882441.68893: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882441.68909: variable 'omit' from source: magic vars 13531 1726882441.69292: variable 'ansible_distribution_major_version' from source: facts 13531 1726882441.69311: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882441.69442: variable 'network_state' from source: role '' defaults 13531 1726882441.69458: Evaluated conditional (network_state != {}): False 13531 1726882441.69470: when evaluation is False, skipping this task 13531 1726882441.69479: _execute() done 13531 1726882441.69486: dumping result to json 13531 1726882441.69494: done dumping result, returning 13531 1726882441.69504: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-4fd9-519d-00000000008c] 13531 1726882441.69515: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000008c 13531 1726882441.69630: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000008c 13531 1726882441.69637: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13531 1726882441.69702: no more pending results, returning what we have 13531 1726882441.69707: results queue empty 13531 1726882441.69708: checking for any_errors_fatal 13531 1726882441.69721: done checking for any_errors_fatal 
13531 1726882441.69722: checking for max_fail_percentage 13531 1726882441.69724: done checking for max_fail_percentage 13531 1726882441.69725: checking to see if all hosts have failed and the running result is not ok 13531 1726882441.69726: done checking to see if all hosts have failed 13531 1726882441.69727: getting the remaining hosts for this loop 13531 1726882441.69728: done getting the remaining hosts for this loop 13531 1726882441.69732: getting the next task for host managed_node2 13531 1726882441.69739: done getting next task for host managed_node2 13531 1726882441.69744: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13531 1726882441.69748: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882441.69769: getting variables 13531 1726882441.69772: in VariableManager get_vars() 13531 1726882441.69830: Calling all_inventory to load vars for managed_node2 13531 1726882441.69833: Calling groups_inventory to load vars for managed_node2 13531 1726882441.69836: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882441.69848: Calling all_plugins_play to load vars for managed_node2 13531 1726882441.69851: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882441.69854: Calling groups_plugins_play to load vars for managed_node2 13531 1726882441.71518: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882441.73445: done with get_vars() 13531 1726882441.73480: done getting variables 13531 1726882441.73558: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:34:01 -0400 (0:00:00.057) 0:00:29.631 ****** 13531 1726882441.73594: entering _queue_task() for managed_node2/debug 13531 1726882441.74063: worker is 1 (out of 1 available) 13531 1726882441.74078: exiting _queue_task() for managed_node2/debug 13531 1726882441.74090: done queuing things up, now waiting for results queue to drain 13531 1726882441.74091: waiting for pending results... 
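The "Configure networking state" task above is skipped because its conditional evaluates to False: `network_state` comes from the role's defaults as `{}`, so `network_state != {}` fails and TaskExecutor short-circuits before the module is ever transferred. A simplified stand-in for that decision (Ansible really evaluates the conditional through Jinja2, not `eval()`; `evaluate_task` is a hypothetical helper):

```python
# Hedged sketch of the skip decision visible in the log. Ansible evaluates the
# `when:` expression via Jinja2 against the host's variables; plain eval() is
# used here only to keep the sketch self-contained.
def evaluate_task(variables, conditional):
    """Return a minimal result dict mirroring the log's skipped shape."""
    if not eval(conditional, {}, dict(variables)):
        return {
            "changed": False,
            "false_condition": conditional,
            "skip_reason": "Conditional result was False",
        }
    return {"changed": False}

# `network_state` defaults to {} in the role, so the conditional is False.
result = evaluate_task({"network_state": {}}, "network_state != {}")
```

The returned dict matches the `skipping: [managed_node2]` result printed in the log, including the `false_condition` and `skip_reason` keys.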
13531 1726882441.74393: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections
13531 1726882441.74529: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000008d
13531 1726882441.74549: variable 'ansible_search_path' from source: unknown
13531 1726882441.74556: variable 'ansible_search_path' from source: unknown
13531 1726882441.74598: calling self._execute()
13531 1726882441.74704: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882441.74716: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882441.74729: variable 'omit' from source: magic vars
13531 1726882441.75094: variable 'ansible_distribution_major_version' from source: facts
13531 1726882441.75111: Evaluated conditional (ansible_distribution_major_version != '6'): True
13531 1726882441.75121: variable 'omit' from source: magic vars
13531 1726882441.75180: variable 'omit' from source: magic vars
13531 1726882441.75225: variable 'omit' from source: magic vars
13531 1726882441.75275: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
13531 1726882441.75321: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
13531 1726882441.75349: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
13531 1726882441.75375: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13531 1726882441.75393: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13531 1726882441.75433: variable 'inventory_hostname' from source: host vars for 'managed_node2'
13531 1726882441.75443: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882441.75451: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882441.75566: Set connection var ansible_pipelining to False
13531 1726882441.75577: Set connection var ansible_timeout to 10
13531 1726882441.75586: Set connection var ansible_shell_executable to /bin/sh
13531 1726882441.75595: Set connection var ansible_module_compression to ZIP_DEFLATED
13531 1726882441.75601: Set connection var ansible_connection to ssh
13531 1726882441.75606: Set connection var ansible_shell_type to sh
13531 1726882441.75640: variable 'ansible_shell_executable' from source: unknown
13531 1726882441.75648: variable 'ansible_connection' from source: unknown
13531 1726882441.75654: variable 'ansible_module_compression' from source: unknown
13531 1726882441.75662: variable 'ansible_shell_type' from source: unknown
13531 1726882441.75671: variable 'ansible_shell_executable' from source: unknown
13531 1726882441.75677: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882441.75683: variable 'ansible_pipelining' from source: unknown
13531 1726882441.75689: variable 'ansible_timeout' from source: unknown
13531 1726882441.75696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882441.75837: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
13531 1726882441.75858: variable 'omit' from source: magic vars
13531 1726882441.75873: starting attempt loop
13531 1726882441.75879: running the handler
13531 1726882441.76013: variable '__network_connections_result' from source: set_fact
13531 1726882441.76073: handler run complete
13531 1726882441.76095: attempt loop complete, returning result
13531 1726882441.76101: _execute() done
13531 1726882441.76107: dumping result to json
13531 1726882441.76113: done dumping result, returning
13531 1726882441.76126: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-4fd9-519d-00000000008d]
13531 1726882441.76136: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000008d
ok: [managed_node2] => {
    "__network_connections_result.stderr_lines": [
        ""
    ]
}
13531 1726882441.76297: no more pending results, returning what we have
13531 1726882441.76301: results queue empty
13531 1726882441.76302: checking for any_errors_fatal
13531 1726882441.76309: done checking for any_errors_fatal
13531 1726882441.76310: checking for max_fail_percentage
13531 1726882441.76312: done checking for max_fail_percentage
13531 1726882441.76313: checking to see if all hosts have failed and the running result is not ok
13531 1726882441.76314: done checking to see if all hosts have failed
13531 1726882441.76315: getting the remaining hosts for this loop
13531 1726882441.76316: done getting the remaining hosts for this loop
13531 1726882441.76320: getting the next task for host managed_node2
13531 1726882441.76327: done getting next task for host managed_node2
13531 1726882441.76330: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
13531 1726882441.76334: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13531 1726882441.76345: getting variables
13531 1726882441.76347: in VariableManager get_vars()
13531 1726882441.76404: Calling all_inventory to load vars for managed_node2
13531 1726882441.76408: Calling groups_inventory to load vars for managed_node2
13531 1726882441.76410: Calling all_plugins_inventory to load vars for managed_node2
13531 1726882441.76421: Calling all_plugins_play to load vars for managed_node2
13531 1726882441.76424: Calling groups_plugins_inventory to load vars for managed_node2
13531 1726882441.76427: Calling groups_plugins_play to load vars for managed_node2
13531 1726882441.77386: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000008d
13531 1726882441.77390: WORKER PROCESS EXITING
13531 1726882441.78232: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13531 1726882441.80113: done with get_vars()
13531 1726882441.80146: done getting variables
13531 1726882441.80210: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181
Friday 20 September 2024 21:34:01 -0400 (0:00:00.066) 0:00:29.697 ******
13531 1726882441.80244: entering _queue_task() for managed_node2/debug
13531 1726882441.80609: worker is 1 (out of 1 available)
13531 1726882441.80623: exiting _queue_task() for managed_node2/debug
13531 1726882441.80635: done queuing things up, now waiting for results queue to drain
13531 1726882441.80637: waiting for pending results...
13531 1726882441.80944: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
13531 1726882441.81097: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000008e
13531 1726882441.81116: variable 'ansible_search_path' from source: unknown
13531 1726882441.81124: variable 'ansible_search_path' from source: unknown
13531 1726882441.81170: calling self._execute()
13531 1726882441.81270: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882441.81282: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882441.81302: variable 'omit' from source: magic vars
13531 1726882441.81669: variable 'ansible_distribution_major_version' from source: facts
13531 1726882441.81687: Evaluated conditional (ansible_distribution_major_version != '6'): True
13531 1726882441.81696: variable 'omit' from source: magic vars
13531 1726882441.81761: variable 'omit' from source: magic vars
13531 1726882441.81803: variable 'omit' from source: magic vars
13531 1726882441.81853: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
13531 1726882441.81894: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
13531 1726882441.81918: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
13531 1726882441.81938: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13531 1726882441.81957: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13531 1726882441.81989: variable 'inventory_hostname' from source: host vars for 'managed_node2'
13531 1726882441.81998: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882441.82004: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882441.82115: Set connection var ansible_pipelining to False
13531 1726882441.82126: Set connection var ansible_timeout to 10
13531 1726882441.82135: Set connection var ansible_shell_executable to /bin/sh
13531 1726882441.82143: Set connection var ansible_module_compression to ZIP_DEFLATED
13531 1726882441.82148: Set connection var ansible_connection to ssh
13531 1726882441.82154: Set connection var ansible_shell_type to sh
13531 1726882441.82187: variable 'ansible_shell_executable' from source: unknown
13531 1726882441.82193: variable 'ansible_connection' from source: unknown
13531 1726882441.82199: variable 'ansible_module_compression' from source: unknown
13531 1726882441.82204: variable 'ansible_shell_type' from source: unknown
13531 1726882441.82208: variable 'ansible_shell_executable' from source: unknown
13531 1726882441.82214: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882441.82220: variable 'ansible_pipelining' from source: unknown
13531 1726882441.82225: variable 'ansible_timeout' from source: unknown
13531 1726882441.82230: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882441.82371: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
13531 1726882441.82393: variable 'omit' from source: magic vars
13531 1726882441.82403: starting attempt loop
13531 1726882441.82410: running the handler
13531 1726882441.82463: variable '__network_connections_result' from source: set_fact
13531 1726882441.82553: variable '__network_connections_result' from source: set_fact
13531 1726882441.82672: handler run complete
13531 1726882441.82707: attempt loop complete, returning result
13531 1726882441.82715: _execute() done
13531 1726882441.82722: dumping result to json
13531 1726882441.82730: done dumping result, returning
13531 1726882441.82743: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-4fd9-519d-00000000008e]
13531 1726882441.82754: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000008e
ok: [managed_node2] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "name": "bond0",
                        "persistent_state": "absent",
                        "state": "down"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "\n",
        "stderr_lines": [
            ""
        ]
    }
}
13531 1726882441.82961: no more pending results, returning what we have
13531 1726882441.82967: results queue empty
13531 1726882441.82968: checking for any_errors_fatal
13531 1726882441.82980: done checking for any_errors_fatal
13531 1726882441.82981: checking for max_fail_percentage
13531 1726882441.82982: done checking for max_fail_percentage
13531 1726882441.82983: checking to see if all hosts have failed and the running result is not ok
13531 1726882441.82984: done checking to see if all hosts have failed
13531 1726882441.82985: getting the remaining hosts for this loop
13531 1726882441.82987: done getting the remaining hosts for this loop
13531 1726882441.82991: getting the next task for host managed_node2
13531 1726882441.82998: done getting next task for host managed_node2
13531 1726882441.83002: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
13531 1726882441.83006: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13531 1726882441.83018: getting variables
13531 1726882441.83020: in VariableManager get_vars()
13531 1726882441.83081: Calling all_inventory to load vars for managed_node2
13531 1726882441.83084: Calling groups_inventory to load vars for managed_node2
13531 1726882441.83087: Calling all_plugins_inventory to load vars for managed_node2
13531 1726882441.83099: Calling all_plugins_play to load vars for managed_node2
13531 1726882441.83103: Calling groups_plugins_inventory to load vars for managed_node2
13531 1726882441.83106: Calling groups_plugins_play to load vars for managed_node2
13531 1726882441.84139: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000008e
13531 1726882441.84143: WORKER PROCESS EXITING
13531 1726882441.85237: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13531 1726882441.87093: done with get_vars()
13531 1726882441.87122: done getting variables
13531 1726882441.87186: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186
Friday 20 September 2024 21:34:01 -0400 (0:00:00.069) 0:00:29.767 ******
13531 1726882441.87227: entering _queue_task() for managed_node2/debug
13531 1726882441.87604: worker is 1 (out of 1 available)
13531 1726882441.87614: exiting _queue_task() for managed_node2/debug
13531 1726882441.87626: done queuing things up, now waiting for results queue to drain
13531 1726882441.87628: waiting for pending results...
13531 1726882441.87945: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
13531 1726882441.88102: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000008f
13531 1726882441.88121: variable 'ansible_search_path' from source: unknown
13531 1726882441.88129: variable 'ansible_search_path' from source: unknown
13531 1726882441.88176: calling self._execute()
13531 1726882441.88282: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882441.88299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882441.88314: variable 'omit' from source: magic vars
13531 1726882441.88747: variable 'ansible_distribution_major_version' from source: facts
13531 1726882441.88774: Evaluated conditional (ansible_distribution_major_version != '6'): True
13531 1726882441.88902: variable 'network_state' from source: role '' defaults
13531 1726882441.88918: Evaluated conditional (network_state != {}): False
13531 1726882441.88925: when evaluation is False, skipping this task
13531 1726882441.88943: _execute() done
13531 1726882441.88966: dumping result to json
13531 1726882441.88975: done dumping result, returning
13531 1726882441.88987: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-4fd9-519d-00000000008f]
13531 1726882441.88999: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000008f
skipping: [managed_node2] => {
    "false_condition": "network_state != {}"
}
13531 1726882441.89153: no more pending results, returning what we have
13531 1726882441.89157: results queue empty
13531 1726882441.89158: checking for any_errors_fatal
13531 1726882441.89170: done checking for any_errors_fatal
13531 1726882441.89172: checking for max_fail_percentage
13531 1726882441.89174: done checking for max_fail_percentage
13531 1726882441.89175: checking to see if all hosts have failed and the running result is not ok
13531 1726882441.89176: done checking to see if all hosts have failed
13531 1726882441.89177: getting the remaining hosts for this loop
13531 1726882441.89178: done getting the remaining hosts for this loop
13531 1726882441.89182: getting the next task for host managed_node2
13531 1726882441.89188: done getting next task for host managed_node2
13531 1726882441.89193: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity
13531 1726882441.89197: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13531 1726882441.89218: getting variables
13531 1726882441.89220: in VariableManager get_vars()
13531 1726882441.89281: Calling all_inventory to load vars for managed_node2
13531 1726882441.89285: Calling groups_inventory to load vars for managed_node2
13531 1726882441.89287: Calling all_plugins_inventory to load vars for managed_node2
13531 1726882441.89301: Calling all_plugins_play to load vars for managed_node2
13531 1726882441.89304: Calling groups_plugins_inventory to load vars for managed_node2
13531 1726882441.89308: Calling groups_plugins_play to load vars for managed_node2
13531 1726882441.90984: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000008f
13531 1726882441.90988: WORKER PROCESS EXITING
13531 1726882441.91170: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13531 1726882441.92467: done with get_vars()
13531 1726882441.92487: done getting variables

TASK [fedora.linux_system_roles.network : Re-test connectivity] ****************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Friday 20 September 2024 21:34:01 -0400 (0:00:00.053) 0:00:29.820 ******
13531 1726882441.92559: entering _queue_task() for managed_node2/ping
13531 1726882441.92789: worker is 1 (out of 1 available)
13531 1726882441.92802: exiting _queue_task() for managed_node2/ping
13531 1726882441.92813: done queuing things up, now waiting for results queue to drain
13531 1726882441.92815: waiting for pending results...
13531 1726882441.93006: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity
13531 1726882441.93102: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000090
13531 1726882441.93113: variable 'ansible_search_path' from source: unknown
13531 1726882441.93118: variable 'ansible_search_path' from source: unknown
13531 1726882441.93150: calling self._execute()
13531 1726882441.93224: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882441.93229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882441.93234: variable 'omit' from source: magic vars
13531 1726882441.93507: variable 'ansible_distribution_major_version' from source: facts
13531 1726882441.93516: Evaluated conditional (ansible_distribution_major_version != '6'): True
13531 1726882441.93526: variable 'omit' from source: magic vars
13531 1726882441.93583: variable 'omit' from source: magic vars
13531 1726882441.93618: variable 'omit' from source: magic vars
13531 1726882441.93765: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
13531 1726882441.93768: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
13531 1726882441.93771: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
13531 1726882441.93773: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13531 1726882441.93775: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13531 1726882441.93900: variable 'inventory_hostname' from source: host vars for 'managed_node2'
13531 1726882441.93903: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882441.93906: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882441.93914: Set connection var ansible_pipelining to False
13531 1726882441.93921: Set connection var ansible_timeout to 10
13531 1726882441.93927: Set connection var ansible_shell_executable to /bin/sh
13531 1726882441.93932: Set connection var ansible_module_compression to ZIP_DEFLATED
13531 1726882441.93935: Set connection var ansible_connection to ssh
13531 1726882441.93937: Set connection var ansible_shell_type to sh
13531 1726882441.93972: variable 'ansible_shell_executable' from source: unknown
13531 1726882441.93975: variable 'ansible_connection' from source: unknown
13531 1726882441.93978: variable 'ansible_module_compression' from source: unknown
13531 1726882441.93981: variable 'ansible_shell_type' from source: unknown
13531 1726882441.93983: variable 'ansible_shell_executable' from source: unknown
13531 1726882441.93985: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882441.93987: variable 'ansible_pipelining' from source: unknown
13531 1726882441.93991: variable 'ansible_timeout' from source: unknown
13531 1726882441.93993: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882441.94213: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__)
13531 1726882441.94220: variable 'omit' from source: magic vars
13531 1726882441.94226: starting attempt loop
13531 1726882441.94229: running the handler
13531 1726882441.94243: _low_level_execute_command(): starting
13531 1726882441.94251: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
13531 1726882441.94996: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
13531 1726882441.95003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
13531 1726882441.95033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
13531 1726882441.95077: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13531 1726882441.95080: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
13531 1726882441.95082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<<
13531 1726882441.95084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13531 1726882441.95139: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
13531 1726882441.95159: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
13531 1726882441.95283: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13531 1726882441.96939: stdout chunk (state=3): >>>/root <<<
13531 1726882441.97039: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13531 1726882441.97122: stderr chunk (state=3): >>><<<
13531 1726882441.97128: stdout chunk (state=3): >>><<<
13531 1726882441.97157: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
13531 1726882441.97176: _low_level_execute_command(): starting
13531 1726882441.97183: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882441.9715886-14839-55647591112086 `" && echo ansible-tmp-1726882441.9715886-14839-55647591112086="` echo /root/.ansible/tmp/ansible-tmp-1726882441.9715886-14839-55647591112086 `" ) && sleep 0'
13531 1726882441.97807: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
13531 1726882441.97815: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
13531 1726882441.97825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
13531 1726882441.97840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
13531 1726882441.97883: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<<
13531 1726882441.97894: stderr chunk (state=3): >>>debug2: match not found <<<
13531 1726882441.97901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13531 1726882441.97913: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
13531 1726882441.97922: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<<
13531 1726882441.97929: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
13531 1726882441.97937: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
13531 1726882441.97946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
13531 1726882441.97960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
13531 1726882441.97969: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<<
13531 1726882441.97977: stderr chunk (state=3): >>>debug2: match found <<<
13531 1726882441.97986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13531 1726882441.98056: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
13531 1726882441.98078: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
13531 1726882441.98090: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
13531 1726882441.98214: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13531 1726882442.00109: stdout chunk (state=3): >>>ansible-tmp-1726882441.9715886-14839-55647591112086=/root/.ansible/tmp/ansible-tmp-1726882441.9715886-14839-55647591112086 <<<
13531 1726882442.00214: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13531 1726882442.00272: stderr chunk (state=3): >>><<<
13531 1726882442.00277: stdout chunk (state=3): >>><<<
13531 1726882442.00299: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882441.9715886-14839-55647591112086=/root/.ansible/tmp/ansible-tmp-1726882441.9715886-14839-55647591112086 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
13531 1726882442.00337: variable 'ansible_module_compression' from source: unknown
13531 1726882442.00372: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13531i_snf1g8/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED
13531 1726882442.00401: variable 'ansible_facts' from source: unknown
13531 1726882442.00453: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882441.9715886-14839-55647591112086/AnsiballZ_ping.py
13531 1726882442.00563: Sending initial data
13531 1726882442.00568: Sent initial data (152 bytes)
13531 1726882442.01474: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
13531 1726882442.01478: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
13531 1726882442.01480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
13531 1726882442.01483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
13531 1726882442.01752: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<<
13531 1726882442.01755: stderr chunk (state=3): >>>debug2: match not found <<<
13531 1726882442.01758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13531 1726882442.01760: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
13531 1726882442.01762: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<<
13531 1726882442.01765: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
13531 1726882442.01767: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
13531 1726882442.01769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
13531 1726882442.01771: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
13531 1726882442.01773: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<<
13531 1726882442.01775: stderr chunk (state=3): >>>debug2: match found <<<
13531 1726882442.01777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13531 1726882442.01779: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
13531 1726882442.01781: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
13531 1726882442.01782: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
13531 1726882442.01884: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13531 1726882442.03657: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<<
13531 1726882442.03754: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<<
13531 1726882442.03854: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13531i_snf1g8/tmpq7vj_c3n /root/.ansible/tmp/ansible-tmp-1726882441.9715886-14839-55647591112086/AnsiballZ_ping.py <<<
13531 1726882442.03950: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<<
13531 1726882442.05065: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13531 1726882442.05190: stderr chunk (state=3): >>><<<
13531 1726882442.05194: stdout chunk (state=3): >>><<<
13531 1726882442.05197: done transferring module to remote
13531 1726882442.05205: _low_level_execute_command(): starting
13531 1726882442.05210: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882441.9715886-14839-55647591112086/ /root/.ansible/tmp/ansible-tmp-1726882441.9715886-14839-55647591112086/AnsiballZ_ping.py && sleep 0'
13531 1726882442.05650: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
13531 1726882442.05658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
13531 1726882442.05710: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13531 1726882442.05713: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
13531 1726882442.05715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13531 1726882442.05775: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
13531 1726882442.05778: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
13531 1726882442.05888: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13531 1726882442.07684: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13531 1726882442.07713: stderr chunk (state=3): >>><<<
13531 1726882442.07716: stdout chunk (state=3): >>><<<
13531 1726882442.07731: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882442.07733: _low_level_execute_command(): starting 13531 1726882442.07738: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882441.9715886-14839-55647591112086/AnsiballZ_ping.py && sleep 0' 13531 1726882442.08186: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882442.08192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882442.08218: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882442.08231: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882442.08243: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882442.08288: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882442.08300: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882442.08415: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882442.21403: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 13531 1726882442.22492: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 13531 1726882442.22496: stdout chunk (state=3): >>><<< 13531 1726882442.22503: stderr chunk (state=3): >>><<< 13531 1726882442.22522: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 13531 1726882442.22546: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882441.9715886-14839-55647591112086/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13531 1726882442.22558: _low_level_execute_command(): starting 13531 1726882442.22560: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882441.9715886-14839-55647591112086/ > /dev/null 2>&1 && sleep 0' 13531 1726882442.23180: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882442.23191: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882442.23198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882442.23213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882442.23252: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882442.23257: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882442.23269: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882442.23283: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882442.23291: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882442.23300: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882442.23303: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882442.23313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882442.23324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882442.23331: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882442.23338: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882442.23347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882442.23417: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882442.23436: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882442.23450: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882442.23593: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882442.25402: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882442.25453: stderr chunk (state=3): >>><<< 13531 1726882442.25457: stdout chunk (state=3): >>><<< 13531 1726882442.25475: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882442.25481: handler run complete 13531 1726882442.25494: attempt loop complete, returning result 13531 1726882442.25498: _execute() done 13531 1726882442.25500: dumping result to json 13531 1726882442.25502: done dumping result, returning 13531 1726882442.25510: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-4fd9-519d-000000000090] 13531 1726882442.25515: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000090 13531 1726882442.25605: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000090 13531 1726882442.25607: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 13531 1726882442.25670: no more pending results, returning what we have 13531 1726882442.25674: results queue empty 13531 1726882442.25675: checking for any_errors_fatal 13531 1726882442.25682: done checking for any_errors_fatal 13531 1726882442.25683: checking for max_fail_percentage 13531 1726882442.25685: done checking for max_fail_percentage 13531 
1726882442.25686: checking to see if all hosts have failed and the running result is not ok 13531 1726882442.25686: done checking to see if all hosts have failed 13531 1726882442.25687: getting the remaining hosts for this loop 13531 1726882442.25688: done getting the remaining hosts for this loop 13531 1726882442.25691: getting the next task for host managed_node2 13531 1726882442.25700: done getting next task for host managed_node2 13531 1726882442.25703: ^ task is: TASK: meta (role_complete) 13531 1726882442.25706: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882442.25717: getting variables 13531 1726882442.25719: in VariableManager get_vars() 13531 1726882442.25771: Calling all_inventory to load vars for managed_node2 13531 1726882442.25774: Calling groups_inventory to load vars for managed_node2 13531 1726882442.25776: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882442.25786: Calling all_plugins_play to load vars for managed_node2 13531 1726882442.25788: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882442.25791: Calling groups_plugins_play to load vars for managed_node2 13531 1726882442.27229: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882442.28256: done with get_vars() 13531 1726882442.28279: done getting variables 13531 1726882442.28340: done queuing things up, now waiting for results queue to drain 13531 1726882442.28341: results queue empty 13531 1726882442.28342: checking for any_errors_fatal 13531 1726882442.28344: done checking for any_errors_fatal 13531 1726882442.28344: checking for max_fail_percentage 13531 1726882442.28345: done checking for max_fail_percentage 13531 1726882442.28346: checking to see if all hosts have failed and the running result is not ok 13531 1726882442.28346: done checking to see if all hosts have failed 13531 1726882442.28347: getting the remaining hosts for this loop 13531 1726882442.28347: done getting the remaining hosts for this loop 13531 1726882442.28349: getting the next task for host managed_node2 13531 1726882442.28352: done getting next task for host managed_node2 13531 1726882442.28354: ^ task is: TASK: From the active connection, get the port1 profile "{{ port1_profile }}" 13531 1726882442.28355: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 13531 1726882442.28358: getting variables 13531 1726882442.28358: in VariableManager get_vars() 13531 1726882442.28375: Calling all_inventory to load vars for managed_node2 13531 1726882442.28376: Calling groups_inventory to load vars for managed_node2 13531 1726882442.28377: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882442.28381: Calling all_plugins_play to load vars for managed_node2 13531 1726882442.28383: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882442.28384: Calling groups_plugins_play to load vars for managed_node2 13531 1726882442.29059: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882442.31694: done with get_vars() 13531 1726882442.31712: done getting variables 13531 1726882442.31743: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13531 1726882442.31840: variable 'port1_profile' from source: play vars TASK [From the active connection, get the port1 profile "bond0.0"] ************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:104 Friday 20 September 2024 21:34:02 -0400 (0:00:00.393) 0:00:30.213 ****** 13531 1726882442.31869: entering _queue_task() for managed_node2/command 13531 1726882442.32097: worker is 1 (out of 1 available) 13531 1726882442.32111: exiting _queue_task() for managed_node2/command 13531 1726882442.32124: done queuing things up, now waiting for results queue to drain 13531 1726882442.32126: waiting for pending results... 
13531 1726882442.32317: running TaskExecutor() for managed_node2/TASK: From the active connection, get the port1 profile "bond0.0" 13531 1726882442.32387: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000000c0 13531 1726882442.32397: variable 'ansible_search_path' from source: unknown 13531 1726882442.32428: calling self._execute() 13531 1726882442.32557: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882442.32566: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882442.32579: variable 'omit' from source: magic vars 13531 1726882442.32848: variable 'ansible_distribution_major_version' from source: facts 13531 1726882442.32859: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882442.32941: variable 'network_provider' from source: set_fact 13531 1726882442.32945: Evaluated conditional (network_provider == "nm"): True 13531 1726882442.32951: variable 'omit' from source: magic vars 13531 1726882442.32973: variable 'omit' from source: magic vars 13531 1726882442.33041: variable 'port1_profile' from source: play vars 13531 1726882442.33053: variable 'omit' from source: magic vars 13531 1726882442.33095: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882442.33123: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882442.33140: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882442.33153: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882442.33166: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882442.33190: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882442.33195: 
variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882442.33197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882442.33272: Set connection var ansible_pipelining to False 13531 1726882442.33275: Set connection var ansible_timeout to 10 13531 1726882442.33281: Set connection var ansible_shell_executable to /bin/sh 13531 1726882442.33286: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882442.33288: Set connection var ansible_connection to ssh 13531 1726882442.33290: Set connection var ansible_shell_type to sh 13531 1726882442.33311: variable 'ansible_shell_executable' from source: unknown 13531 1726882442.33314: variable 'ansible_connection' from source: unknown 13531 1726882442.33316: variable 'ansible_module_compression' from source: unknown 13531 1726882442.33318: variable 'ansible_shell_type' from source: unknown 13531 1726882442.33321: variable 'ansible_shell_executable' from source: unknown 13531 1726882442.33323: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882442.33325: variable 'ansible_pipelining' from source: unknown 13531 1726882442.33332: variable 'ansible_timeout' from source: unknown 13531 1726882442.33335: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882442.33460: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882442.33473: variable 'omit' from source: magic vars 13531 1726882442.33515: starting attempt loop 13531 1726882442.33518: running the handler 13531 1726882442.33520: _low_level_execute_command(): starting 13531 1726882442.33522: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13531 
1726882442.34743: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882442.34747: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882442.34749: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882442.34751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882442.34753: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882442.34756: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882442.34758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882442.34759: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882442.34761: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882442.34765: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882442.34767: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882442.34769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882442.34772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882442.34773: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882442.34775: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882442.34777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882442.35506: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882442.35511: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882442.35513: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882442.35515: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882442.36687: stdout chunk (state=3): >>>/root <<< 13531 1726882442.36869: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882442.36873: stdout chunk (state=3): >>><<< 13531 1726882442.36875: stderr chunk (state=3): >>><<< 13531 1726882442.36999: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882442.37002: _low_level_execute_command(): starting 13531 1726882442.37006: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882442.369022-14865-235463832855862 `" && echo 
ansible-tmp-1726882442.369022-14865-235463832855862="` echo /root/.ansible/tmp/ansible-tmp-1726882442.369022-14865-235463832855862 `" ) && sleep 0' 13531 1726882442.37547: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882442.37569: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882442.37584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882442.37602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882442.37642: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882442.37658: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882442.37675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882442.37703: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882442.37717: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882442.37759: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882442.37763: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882442.37834: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882442.37839: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882442.37842: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882442.37929: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882442.39852: stdout chunk (state=3): >>>ansible-tmp-1726882442.369022-14865-235463832855862=/root/.ansible/tmp/ansible-tmp-1726882442.369022-14865-235463832855862 <<< 13531 1726882442.39962: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882442.40041: stderr chunk (state=3): >>><<< 13531 1726882442.40044: stdout chunk (state=3): >>><<< 13531 1726882442.40369: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882442.369022-14865-235463832855862=/root/.ansible/tmp/ansible-tmp-1726882442.369022-14865-235463832855862 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882442.40372: variable 'ansible_module_compression' from source: unknown 13531 1726882442.40375: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-13531i_snf1g8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13531 1726882442.40377: variable 'ansible_facts' from source: unknown 13531 1726882442.40379: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882442.369022-14865-235463832855862/AnsiballZ_command.py 13531 1726882442.40777: Sending initial data 13531 1726882442.40780: Sent initial data (155 bytes) 13531 1726882442.41988: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882442.42003: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882442.42018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882442.42036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882442.42083: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882442.42095: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882442.42106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882442.42120: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882442.42129: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882442.42141: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882442.42152: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882442.42169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882442.42184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882442.42194: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 
13531 1726882442.42202: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882442.42213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882442.42284: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882442.42305: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882442.42323: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882442.42450: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882442.44252: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13531 1726882442.44346: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13531 1726882442.44441: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13531i_snf1g8/tmpf8ckewas /root/.ansible/tmp/ansible-tmp-1726882442.369022-14865-235463832855862/AnsiballZ_command.py <<< 13531 1726882442.44528: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13531 1726882442.45802: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882442.45885: stderr chunk (state=3): >>><<< 13531 1726882442.45889: stdout chunk (state=3): >>><<< 13531 1726882442.45909: done 
transferring module to remote 13531 1726882442.45922: _low_level_execute_command(): starting 13531 1726882442.45927: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882442.369022-14865-235463832855862/ /root/.ansible/tmp/ansible-tmp-1726882442.369022-14865-235463832855862/AnsiballZ_command.py && sleep 0' 13531 1726882442.46708: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882442.46717: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882442.46727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882442.46745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882442.46784: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882442.46791: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882442.46801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882442.46814: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882442.46821: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882442.46828: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882442.46836: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882442.46849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882442.46860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882442.46870: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882442.46877: stderr chunk (state=3): >>>debug2: 
match found <<< 13531 1726882442.46886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882442.46968: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882442.46985: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882442.46997: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882442.47132: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882442.48935: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882442.49009: stderr chunk (state=3): >>><<< 13531 1726882442.49015: stdout chunk (state=3): >>><<< 13531 1726882442.49037: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 
1726882442.49040: _low_level_execute_command(): starting 13531 1726882442.49045: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882442.369022-14865-235463832855862/AnsiballZ_command.py && sleep 0' 13531 1726882442.49690: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882442.49699: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882442.49708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882442.49723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882442.49766: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882442.49774: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882442.49784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882442.49796: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882442.49803: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882442.49810: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882442.49817: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882442.49826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882442.49844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882442.49851: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882442.49858: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882442.49868: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882442.49940: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882442.49965: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882442.49977: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882442.50107: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882442.65190: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "c", "show", "--active", "bond0.0"], "start": "2024-09-20 21:34:02.630794", "end": "2024-09-20 21:34:02.649843", "delta": "0:00:00.019049", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli c show --active bond0.0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13531 1726882442.66484: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 13531 1726882442.66488: stderr chunk (state=3): >>><<< 13531 1726882442.66492: stdout chunk (state=3): >>><<< 13531 1726882442.66510: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "c", "show", "--active", "bond0.0"], "start": "2024-09-20 21:34:02.630794", "end": "2024-09-20 21:34:02.649843", "delta": "0:00:00.019049", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli c show --active bond0.0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
13531 1726882442.66541: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli c show --active bond0.0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882442.369022-14865-235463832855862/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13531 1726882442.66547: _low_level_execute_command(): starting 13531 1726882442.66552: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882442.369022-14865-235463832855862/ > /dev/null 2>&1 && sleep 0' 13531 1726882442.67010: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882442.67014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882442.67051: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882442.67054: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882442.67058: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882442.67114: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882442.67117: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882442.67123: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882442.67220: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882442.69061: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882442.69118: stderr chunk (state=3): >>><<< 13531 1726882442.69122: stdout chunk (state=3): >>><<< 13531 1726882442.69140: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882442.69145: handler run complete 13531 1726882442.69167: Evaluated conditional (False): False 13531 1726882442.69177: attempt loop complete, returning result 13531 1726882442.69179: _execute() done 13531 1726882442.69182: dumping result to json 13531 1726882442.69186: done dumping result, returning 13531 1726882442.69194: done running TaskExecutor() for managed_node2/TASK: From the active connection, get the port1 profile "bond0.0" [0e448fcc-3ce9-4fd9-519d-0000000000c0] 13531 1726882442.69199: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000c0 13531 1726882442.69306: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000c0 13531 1726882442.69310: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "nmcli", "c", "show", "--active", "bond0.0" ], "delta": "0:00:00.019049", "end": "2024-09-20 21:34:02.649843", "rc": 0, "start": "2024-09-20 21:34:02.630794" } 13531 1726882442.69391: no more pending results, returning what we have 13531 1726882442.69395: results queue empty 13531 1726882442.69396: checking for any_errors_fatal 13531 1726882442.69398: done checking for any_errors_fatal 13531 1726882442.69399: checking for max_fail_percentage 13531 1726882442.69402: done checking for max_fail_percentage 13531 1726882442.69403: checking to see if all hosts have failed and the running result is not ok 13531 1726882442.69403: done checking to see if all hosts have failed 13531 1726882442.69404: getting the remaining hosts for this loop 13531 1726882442.69405: done getting the remaining hosts for this loop 13531 1726882442.69409: getting the next task for host managed_node2 13531 1726882442.69415: done getting next task for host managed_node2 13531 1726882442.69418: ^ task is: TASK: From the active connection, get the port2 profile "{{ port2_profile }}" 13531 1726882442.69422: ^ state is: HOST STATE: block=2, task=20, 
rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882442.69426: getting variables 13531 1726882442.69428: in VariableManager get_vars() 13531 1726882442.69490: Calling all_inventory to load vars for managed_node2 13531 1726882442.69493: Calling groups_inventory to load vars for managed_node2 13531 1726882442.69495: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882442.69506: Calling all_plugins_play to load vars for managed_node2 13531 1726882442.69509: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882442.69511: Calling groups_plugins_play to load vars for managed_node2 13531 1726882442.70418: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882442.72357: done with get_vars() 13531 1726882442.72382: done getting variables 13531 1726882442.72430: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13531 1726882442.72534: variable 'port2_profile' from source: play vars TASK [From the active connection, get the port2 profile "bond0.1"] ************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:111 Friday 20 September 2024 21:34:02 -0400 (0:00:00.406) 0:00:30.620 ****** 13531 1726882442.72562: entering _queue_task() for managed_node2/command 13531 1726882442.72804: worker is 1 (out of 1 available) 13531 1726882442.72818: exiting _queue_task() for managed_node2/command 13531 1726882442.72830: done queuing things 
up, now waiting for results queue to drain 13531 1726882442.72831: waiting for pending results... 13531 1726882442.73020: running TaskExecutor() for managed_node2/TASK: From the active connection, get the port2 profile "bond0.1" 13531 1726882442.73088: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000000c1 13531 1726882442.73100: variable 'ansible_search_path' from source: unknown 13531 1726882442.73131: calling self._execute() 13531 1726882442.73213: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882442.73217: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882442.73226: variable 'omit' from source: magic vars 13531 1726882442.73491: variable 'ansible_distribution_major_version' from source: facts 13531 1726882442.73507: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882442.73586: variable 'network_provider' from source: set_fact 13531 1726882442.73590: Evaluated conditional (network_provider == "nm"): True 13531 1726882442.73595: variable 'omit' from source: magic vars 13531 1726882442.73618: variable 'omit' from source: magic vars 13531 1726882442.73687: variable 'port2_profile' from source: play vars 13531 1726882442.73701: variable 'omit' from source: magic vars 13531 1726882442.73739: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882442.73767: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882442.73784: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882442.73796: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882442.73805: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882442.73832: 
variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882442.73835: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882442.73837: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882442.73908: Set connection var ansible_pipelining to False 13531 1726882442.73911: Set connection var ansible_timeout to 10 13531 1726882442.73917: Set connection var ansible_shell_executable to /bin/sh 13531 1726882442.73921: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882442.73923: Set connection var ansible_connection to ssh 13531 1726882442.73926: Set connection var ansible_shell_type to sh 13531 1726882442.73947: variable 'ansible_shell_executable' from source: unknown 13531 1726882442.73950: variable 'ansible_connection' from source: unknown 13531 1726882442.73953: variable 'ansible_module_compression' from source: unknown 13531 1726882442.73957: variable 'ansible_shell_type' from source: unknown 13531 1726882442.73960: variable 'ansible_shell_executable' from source: unknown 13531 1726882442.73962: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882442.73966: variable 'ansible_pipelining' from source: unknown 13531 1726882442.73968: variable 'ansible_timeout' from source: unknown 13531 1726882442.73971: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882442.74078: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882442.74097: variable 'omit' from source: magic vars 13531 1726882442.74109: starting attempt loop 13531 1726882442.74115: running the handler 13531 1726882442.74127: _low_level_execute_command(): starting 13531 
1726882442.74134: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13531 1726882442.75083: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882442.75160: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882442.76821: stdout chunk (state=3): >>>/root <<< 13531 1726882442.76930: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882442.76978: stderr chunk (state=3): >>><<< 13531 1726882442.76984: stdout chunk (state=3): >>><<< 13531 1726882442.77002: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882442.77015: _low_level_execute_command(): starting 13531 1726882442.77027: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882442.7700384-14884-218403328853527 `" && echo ansible-tmp-1726882442.7700384-14884-218403328853527="` echo /root/.ansible/tmp/ansible-tmp-1726882442.7700384-14884-218403328853527 `" ) && sleep 0' 13531 1726882442.77491: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882442.77497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882442.77526: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882442.77533: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882442.77541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882442.77551: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass <<< 13531 1726882442.77566: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882442.77579: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882442.77582: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882442.77590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882442.77600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882442.77607: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882442.77613: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882442.77620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882442.77678: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882442.77689: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882442.77695: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882442.77812: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882442.79763: stdout chunk (state=3): >>>ansible-tmp-1726882442.7700384-14884-218403328853527=/root/.ansible/tmp/ansible-tmp-1726882442.7700384-14884-218403328853527 <<< 13531 1726882442.79876: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882442.79929: stderr chunk (state=3): >>><<< 13531 1726882442.79933: stdout chunk (state=3): >>><<< 13531 1726882442.79949: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882442.7700384-14884-218403328853527=/root/.ansible/tmp/ansible-tmp-1726882442.7700384-14884-218403328853527 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882442.79983: variable 'ansible_module_compression' from source: unknown 13531 1726882442.80023: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13531i_snf1g8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13531 1726882442.80053: variable 'ansible_facts' from source: unknown 13531 1726882442.80117: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882442.7700384-14884-218403328853527/AnsiballZ_command.py 13531 1726882442.80231: Sending initial data 13531 1726882442.80234: Sent initial data (156 bytes) 13531 1726882442.80918: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882442.80922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882442.80969: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 13531 1726882442.80973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882442.80985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882442.81024: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882442.81036: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882442.81147: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882442.82917: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13531 1726882442.83010: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13531 1726882442.83106: stdout chunk 
(state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13531i_snf1g8/tmpd3ob4qfy /root/.ansible/tmp/ansible-tmp-1726882442.7700384-14884-218403328853527/AnsiballZ_command.py <<< 13531 1726882442.83205: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13531 1726882442.84234: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882442.84346: stderr chunk (state=3): >>><<< 13531 1726882442.84350: stdout chunk (state=3): >>><<< 13531 1726882442.84369: done transferring module to remote 13531 1726882442.84381: _low_level_execute_command(): starting 13531 1726882442.84384: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882442.7700384-14884-218403328853527/ /root/.ansible/tmp/ansible-tmp-1726882442.7700384-14884-218403328853527/AnsiballZ_command.py && sleep 0' 13531 1726882442.84847: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882442.84851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882442.84886: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882442.84900: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882442.84957: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882442.84971: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882442.85086: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882442.86845: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882442.86916: stderr chunk (state=3): >>><<< 13531 1726882442.86919: stdout chunk (state=3): >>><<< 13531 1726882442.86933: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882442.86936: _low_level_execute_command(): starting 13531 1726882442.86941: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1726882442.7700384-14884-218403328853527/AnsiballZ_command.py && sleep 0' 13531 1726882442.87423: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882442.87427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882442.87466: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882442.87469: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882442.87472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882442.87516: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882442.87527: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882442.87645: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882443.02802: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "c", "show", "--active", "bond0.1"], "start": "2024-09-20 21:34:03.006879", "end": "2024-09-20 21:34:03.026091", "delta": "0:00:00.019212", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli c show --active bond0.1", 
"_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13531 1726882443.04171: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 13531 1726882443.04175: stdout chunk (state=3): >>><<< 13531 1726882443.04177: stderr chunk (state=3): >>><<< 13531 1726882443.04321: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "c", "show", "--active", "bond0.1"], "start": "2024-09-20 21:34:03.006879", "end": "2024-09-20 21:34:03.026091", "delta": "0:00:00.019212", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli c show --active bond0.1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 13531 1726882443.04327: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli c show --active bond0.1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882442.7700384-14884-218403328853527/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13531 1726882443.04330: _low_level_execute_command(): starting 13531 1726882443.04332: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882442.7700384-14884-218403328853527/ > /dev/null 2>&1 && sleep 0' 13531 1726882443.05007: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882443.05028: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882443.05043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882443.05072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882443.05123: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882443.05139: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882443.05157: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882443.05180: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882443.05193: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882443.05211: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882443.05224: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882443.05246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882443.05269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882443.05283: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882443.05294: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882443.05309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882443.05394: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882443.05418: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882443.05435: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882443.05640: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882443.07444: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882443.07542: stderr chunk (state=3): >>><<< 13531 1726882443.07553: stdout chunk (state=3): >>><<< 13531 1726882443.07770: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882443.07774: handler run complete 13531 1726882443.07777: Evaluated conditional (False): False 13531 1726882443.07779: attempt loop complete, returning result 13531 1726882443.07781: _execute() done 13531 1726882443.07783: dumping result to json 13531 1726882443.07785: done dumping result, returning 13531 1726882443.07787: done running TaskExecutor() for managed_node2/TASK: From the active connection, get the port2 profile "bond0.1" [0e448fcc-3ce9-4fd9-519d-0000000000c1] 13531 1726882443.07789: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000c1 13531 1726882443.07877: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000c1 13531 1726882443.07880: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "nmcli", "c", "show", "--active", "bond0.1" ], "delta": "0:00:00.019212", "end": "2024-09-20 21:34:03.026091", "rc": 0, "start": "2024-09-20 21:34:03.006879" } 13531 1726882443.07968: no more pending results, returning what we have 13531 1726882443.07972: results queue empty 13531 1726882443.07973: checking 
for any_errors_fatal 13531 1726882443.07986: done checking for any_errors_fatal 13531 1726882443.07987: checking for max_fail_percentage 13531 1726882443.07990: done checking for max_fail_percentage 13531 1726882443.07991: checking to see if all hosts have failed and the running result is not ok 13531 1726882443.07992: done checking to see if all hosts have failed 13531 1726882443.07992: getting the remaining hosts for this loop 13531 1726882443.07994: done getting the remaining hosts for this loop 13531 1726882443.07998: getting the next task for host managed_node2 13531 1726882443.08005: done getting next task for host managed_node2 13531 1726882443.08007: ^ task is: TASK: Assert that the port1 profile is not activated 13531 1726882443.08010: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882443.08014: getting variables 13531 1726882443.08016: in VariableManager get_vars() 13531 1726882443.08088: Calling all_inventory to load vars for managed_node2 13531 1726882443.08091: Calling groups_inventory to load vars for managed_node2 13531 1726882443.08094: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882443.08107: Calling all_plugins_play to load vars for managed_node2 13531 1726882443.08111: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882443.08114: Calling groups_plugins_play to load vars for managed_node2 13531 1726882443.10050: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882443.11947: done with get_vars() 13531 1726882443.11994: done getting variables 13531 1726882443.12072: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the port1 profile is not activated] ************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:118 Friday 20 September 2024 21:34:03 -0400 (0:00:00.395) 0:00:31.016 ****** 13531 1726882443.12105: entering _queue_task() for managed_node2/assert 13531 1726882443.12451: worker is 1 (out of 1 available) 13531 1726882443.12468: exiting _queue_task() for managed_node2/assert 13531 1726882443.12481: done queuing things up, now waiting for results queue to drain 13531 1726882443.12482: waiting for pending results... 
13531 1726882443.12804: running TaskExecutor() for managed_node2/TASK: Assert that the port1 profile is not activated 13531 1726882443.12943: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000000c2 13531 1726882443.12970: variable 'ansible_search_path' from source: unknown 13531 1726882443.13018: calling self._execute() 13531 1726882443.13143: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882443.13163: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882443.13181: variable 'omit' from source: magic vars 13531 1726882443.13617: variable 'ansible_distribution_major_version' from source: facts 13531 1726882443.13636: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882443.13772: variable 'network_provider' from source: set_fact 13531 1726882443.13784: Evaluated conditional (network_provider == "nm"): True 13531 1726882443.13801: variable 'omit' from source: magic vars 13531 1726882443.13834: variable 'omit' from source: magic vars 13531 1726882443.13948: variable 'port1_profile' from source: play vars 13531 1726882443.13976: variable 'omit' from source: magic vars 13531 1726882443.14040: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882443.14085: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882443.14112: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882443.14144: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882443.14165: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882443.14202: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882443.14210: variable 
'ansible_host' from source: host vars for 'managed_node2' 13531 1726882443.14218: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882443.14337: Set connection var ansible_pipelining to False 13531 1726882443.14357: Set connection var ansible_timeout to 10 13531 1726882443.14369: Set connection var ansible_shell_executable to /bin/sh 13531 1726882443.14378: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882443.14384: Set connection var ansible_connection to ssh 13531 1726882443.14390: Set connection var ansible_shell_type to sh 13531 1726882443.14419: variable 'ansible_shell_executable' from source: unknown 13531 1726882443.14426: variable 'ansible_connection' from source: unknown 13531 1726882443.14432: variable 'ansible_module_compression' from source: unknown 13531 1726882443.14438: variable 'ansible_shell_type' from source: unknown 13531 1726882443.14444: variable 'ansible_shell_executable' from source: unknown 13531 1726882443.14456: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882443.14470: variable 'ansible_pipelining' from source: unknown 13531 1726882443.14477: variable 'ansible_timeout' from source: unknown 13531 1726882443.14485: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882443.14631: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882443.14648: variable 'omit' from source: magic vars 13531 1726882443.14661: starting attempt loop 13531 1726882443.14673: running the handler 13531 1726882443.14850: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13531 1726882443.17433: Loading FilterModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13531 1726882443.17516: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13531 1726882443.17559: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13531 1726882443.17607: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13531 1726882443.17650: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13531 1726882443.17739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882443.17777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882443.17810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882443.17865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882443.17887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882443.18002: variable 'active_port1_profile' from source: set_fact 13531 1726882443.18023: Evaluated conditional (active_port1_profile.stdout | length == 0): True 13531 1726882443.18037: handler run complete 13531 1726882443.18061: attempt loop 
complete, returning result 13531 1726882443.18070: _execute() done 13531 1726882443.18075: dumping result to json 13531 1726882443.18081: done dumping result, returning 13531 1726882443.18091: done running TaskExecutor() for managed_node2/TASK: Assert that the port1 profile is not activated [0e448fcc-3ce9-4fd9-519d-0000000000c2] 13531 1726882443.18100: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000c2 ok: [managed_node2] => { "changed": false } MSG: All assertions passed 13531 1726882443.18270: no more pending results, returning what we have 13531 1726882443.18274: results queue empty 13531 1726882443.18275: checking for any_errors_fatal 13531 1726882443.18285: done checking for any_errors_fatal 13531 1726882443.18285: checking for max_fail_percentage 13531 1726882443.18288: done checking for max_fail_percentage 13531 1726882443.18289: checking to see if all hosts have failed and the running result is not ok 13531 1726882443.18290: done checking to see if all hosts have failed 13531 1726882443.18291: getting the remaining hosts for this loop 13531 1726882443.18292: done getting the remaining hosts for this loop 13531 1726882443.18297: getting the next task for host managed_node2 13531 1726882443.18303: done getting next task for host managed_node2 13531 1726882443.18306: ^ task is: TASK: Assert that the port2 profile is not activated 13531 1726882443.18308: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882443.18312: getting variables 13531 1726882443.18314: in VariableManager get_vars() 13531 1726882443.18386: Calling all_inventory to load vars for managed_node2 13531 1726882443.18397: Calling groups_inventory to load vars for managed_node2 13531 1726882443.18399: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882443.18413: Calling all_plugins_play to load vars for managed_node2 13531 1726882443.18416: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882443.18419: Calling groups_plugins_play to load vars for managed_node2 13531 1726882443.19410: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000c2 13531 1726882443.19414: WORKER PROCESS EXITING 13531 1726882443.20388: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882443.23330: done with get_vars() 13531 1726882443.23589: done getting variables 13531 1726882443.23651: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the port2 profile is not activated] ************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:125 Friday 20 September 2024 21:34:03 -0400 (0:00:00.115) 0:00:31.132 ****** 13531 1726882443.23685: entering _queue_task() for managed_node2/assert 13531 1726882443.24408: worker is 1 (out of 1 available) 13531 1726882443.24421: exiting _queue_task() for managed_node2/assert 13531 1726882443.24433: done queuing things up, now waiting for results queue to drain 13531 1726882443.24434: waiting for pending results... 
13531 1726882443.24989: running TaskExecutor() for managed_node2/TASK: Assert that the port2 profile is not activated 13531 1726882443.25110: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000000c3 13531 1726882443.25133: variable 'ansible_search_path' from source: unknown 13531 1726882443.25182: calling self._execute() 13531 1726882443.25293: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882443.25306: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882443.25320: variable 'omit' from source: magic vars 13531 1726882443.25739: variable 'ansible_distribution_major_version' from source: facts 13531 1726882443.25765: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882443.25909: variable 'network_provider' from source: set_fact 13531 1726882443.25922: Evaluated conditional (network_provider == "nm"): True 13531 1726882443.25934: variable 'omit' from source: magic vars 13531 1726882443.25969: variable 'omit' from source: magic vars 13531 1726882443.26080: variable 'port2_profile' from source: play vars 13531 1726882443.26109: variable 'omit' from source: magic vars 13531 1726882443.26159: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882443.26202: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882443.26232: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882443.26252: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882443.26272: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882443.26305: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882443.26315: variable 
'ansible_host' from source: host vars for 'managed_node2' 13531 1726882443.26326: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882443.26436: Set connection var ansible_pipelining to False 13531 1726882443.26446: Set connection var ansible_timeout to 10 13531 1726882443.26458: Set connection var ansible_shell_executable to /bin/sh 13531 1726882443.26470: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882443.26476: Set connection var ansible_connection to ssh 13531 1726882443.26481: Set connection var ansible_shell_type to sh 13531 1726882443.26511: variable 'ansible_shell_executable' from source: unknown 13531 1726882443.26517: variable 'ansible_connection' from source: unknown 13531 1726882443.26523: variable 'ansible_module_compression' from source: unknown 13531 1726882443.26529: variable 'ansible_shell_type' from source: unknown 13531 1726882443.26541: variable 'ansible_shell_executable' from source: unknown 13531 1726882443.26547: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882443.26556: variable 'ansible_pipelining' from source: unknown 13531 1726882443.26562: variable 'ansible_timeout' from source: unknown 13531 1726882443.26572: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882443.26715: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882443.26731: variable 'omit' from source: magic vars 13531 1726882443.26740: starting attempt loop 13531 1726882443.26746: running the handler 13531 1726882443.26927: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13531 1726882443.29680: Loading FilterModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13531 1726882443.29762: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13531 1726882443.29814: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13531 1726882443.29856: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13531 1726882443.29890: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13531 1726882443.29977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882443.30011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882443.30049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882443.30151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882443.30177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882443.30299: variable 'active_port2_profile' from source: set_fact 13531 1726882443.30325: Evaluated conditional (active_port2_profile.stdout | length == 0): True 13531 1726882443.30337: handler run complete 13531 1726882443.30369: attempt loop 
complete, returning result 13531 1726882443.30377: _execute() done 13531 1726882443.30383: dumping result to json 13531 1726882443.30390: done dumping result, returning 13531 1726882443.30403: done running TaskExecutor() for managed_node2/TASK: Assert that the port2 profile is not activated [0e448fcc-3ce9-4fd9-519d-0000000000c3] 13531 1726882443.30414: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000c3 ok: [managed_node2] => { "changed": false } MSG: All assertions passed 13531 1726882443.30575: no more pending results, returning what we have 13531 1726882443.30578: results queue empty 13531 1726882443.30579: checking for any_errors_fatal 13531 1726882443.30588: done checking for any_errors_fatal 13531 1726882443.30588: checking for max_fail_percentage 13531 1726882443.30590: done checking for max_fail_percentage 13531 1726882443.30591: checking to see if all hosts have failed and the running result is not ok 13531 1726882443.30592: done checking to see if all hosts have failed 13531 1726882443.30593: getting the remaining hosts for this loop 13531 1726882443.30594: done getting the remaining hosts for this loop 13531 1726882443.30599: getting the next task for host managed_node2 13531 1726882443.30605: done getting next task for host managed_node2 13531 1726882443.30608: ^ task is: TASK: Get the port1 device state 13531 1726882443.30610: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882443.30613: getting variables 13531 1726882443.30615: in VariableManager get_vars() 13531 1726882443.30684: Calling all_inventory to load vars for managed_node2 13531 1726882443.30687: Calling groups_inventory to load vars for managed_node2 13531 1726882443.30697: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882443.30710: Calling all_plugins_play to load vars for managed_node2 13531 1726882443.30713: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882443.30717: Calling groups_plugins_play to load vars for managed_node2 13531 1726882443.31688: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000c3 13531 1726882443.31691: WORKER PROCESS EXITING 13531 1726882443.32684: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882443.34869: done with get_vars() 13531 1726882443.34903: done getting variables 13531 1726882443.35086: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get the port1 device state] ********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:132 Friday 20 September 2024 21:34:03 -0400 (0:00:00.114) 0:00:31.246 ****** 13531 1726882443.35116: entering _queue_task() for managed_node2/command 13531 1726882443.35797: worker is 1 (out of 1 available) 13531 1726882443.35873: exiting _queue_task() for managed_node2/command 13531 1726882443.35887: done queuing things up, now waiting for results queue to drain 13531 1726882443.35888: waiting for pending results... 
13531 1726882443.36520: running TaskExecutor() for managed_node2/TASK: Get the port1 device state 13531 1726882443.36636: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000000c4 13531 1726882443.36658: variable 'ansible_search_path' from source: unknown 13531 1726882443.36712: calling self._execute() 13531 1726882443.36824: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882443.36835: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882443.36849: variable 'omit' from source: magic vars 13531 1726882443.37251: variable 'ansible_distribution_major_version' from source: facts 13531 1726882443.37275: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882443.37404: variable 'network_provider' from source: set_fact 13531 1726882443.37414: Evaluated conditional (network_provider == "initscripts"): False 13531 1726882443.37421: when evaluation is False, skipping this task 13531 1726882443.37428: _execute() done 13531 1726882443.37433: dumping result to json 13531 1726882443.37440: done dumping result, returning 13531 1726882443.37459: done running TaskExecutor() for managed_node2/TASK: Get the port1 device state [0e448fcc-3ce9-4fd9-519d-0000000000c4] 13531 1726882443.37472: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000c4 skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 13531 1726882443.37629: no more pending results, returning what we have 13531 1726882443.37634: results queue empty 13531 1726882443.37635: checking for any_errors_fatal 13531 1726882443.37644: done checking for any_errors_fatal 13531 1726882443.37645: checking for max_fail_percentage 13531 1726882443.37647: done checking for max_fail_percentage 13531 1726882443.37648: checking to see if all hosts have failed and the running result is not ok 13531 1726882443.37649: done checking to see if all 
hosts have failed 13531 1726882443.37650: getting the remaining hosts for this loop 13531 1726882443.37652: done getting the remaining hosts for this loop 13531 1726882443.37658: getting the next task for host managed_node2 13531 1726882443.37667: done getting next task for host managed_node2 13531 1726882443.37671: ^ task is: TASK: Get the port2 device state 13531 1726882443.37675: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882443.37681: getting variables 13531 1726882443.37683: in VariableManager get_vars() 13531 1726882443.37743: Calling all_inventory to load vars for managed_node2 13531 1726882443.37746: Calling groups_inventory to load vars for managed_node2 13531 1726882443.37749: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882443.37768: Calling all_plugins_play to load vars for managed_node2 13531 1726882443.37771: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882443.37774: Calling groups_plugins_play to load vars for managed_node2 13531 1726882443.40676: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000c4 13531 1726882443.40681: WORKER PROCESS EXITING 13531 1726882443.42176: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882443.46171: done with get_vars() 13531 1726882443.46203: done getting variables 13531 1726882443.46268: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get the port2 device state] 
********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:139 Friday 20 September 2024 21:34:03 -0400 (0:00:00.111) 0:00:31.358 ****** 13531 1726882443.46297: entering _queue_task() for managed_node2/command 13531 1726882443.46829: worker is 1 (out of 1 available) 13531 1726882443.46842: exiting _queue_task() for managed_node2/command 13531 1726882443.46858: done queuing things up, now waiting for results queue to drain 13531 1726882443.46860: waiting for pending results... 13531 1726882443.47320: running TaskExecutor() for managed_node2/TASK: Get the port2 device state 13531 1726882443.47430: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000000c5 13531 1726882443.47451: variable 'ansible_search_path' from source: unknown 13531 1726882443.47503: calling self._execute() 13531 1726882443.47615: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882443.47626: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882443.47638: variable 'omit' from source: magic vars 13531 1726882443.48049: variable 'ansible_distribution_major_version' from source: facts 13531 1726882443.48071: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882443.48195: variable 'network_provider' from source: set_fact 13531 1726882443.48207: Evaluated conditional (network_provider == "initscripts"): False 13531 1726882443.48215: when evaluation is False, skipping this task 13531 1726882443.48230: _execute() done 13531 1726882443.48237: dumping result to json 13531 1726882443.48245: done dumping result, returning 13531 1726882443.48259: done running TaskExecutor() for managed_node2/TASK: Get the port2 device state [0e448fcc-3ce9-4fd9-519d-0000000000c5] 13531 1726882443.48273: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000c5 skipping: [managed_node2] => { "changed": false, 
"false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 13531 1726882443.48427: no more pending results, returning what we have 13531 1726882443.48432: results queue empty 13531 1726882443.48433: checking for any_errors_fatal 13531 1726882443.48442: done checking for any_errors_fatal 13531 1726882443.48443: checking for max_fail_percentage 13531 1726882443.48445: done checking for max_fail_percentage 13531 1726882443.48446: checking to see if all hosts have failed and the running result is not ok 13531 1726882443.48447: done checking to see if all hosts have failed 13531 1726882443.48448: getting the remaining hosts for this loop 13531 1726882443.48449: done getting the remaining hosts for this loop 13531 1726882443.48453: getting the next task for host managed_node2 13531 1726882443.48463: done getting next task for host managed_node2 13531 1726882443.48468: ^ task is: TASK: Assert that the port1 device is in DOWN state 13531 1726882443.48472: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882443.48475: getting variables 13531 1726882443.48477: in VariableManager get_vars() 13531 1726882443.48546: Calling all_inventory to load vars for managed_node2 13531 1726882443.48549: Calling groups_inventory to load vars for managed_node2 13531 1726882443.48552: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882443.48572: Calling all_plugins_play to load vars for managed_node2 13531 1726882443.48576: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882443.48579: Calling groups_plugins_play to load vars for managed_node2 13531 1726882443.49616: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000c5 13531 1726882443.49620: WORKER PROCESS EXITING 13531 1726882443.51908: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882443.55478: done with get_vars() 13531 1726882443.55513: done getting variables 13531 1726882443.55582: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the port1 device is in DOWN state] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:146 Friday 20 September 2024 21:34:03 -0400 (0:00:00.093) 0:00:31.451 ****** 13531 1726882443.55613: entering _queue_task() for managed_node2/assert 13531 1726882443.55946: worker is 1 (out of 1 available) 13531 1726882443.55962: exiting _queue_task() for managed_node2/assert 13531 1726882443.55978: done queuing things up, now waiting for results queue to drain 13531 1726882443.55979: waiting for pending results... 
13531 1726882443.57771: running TaskExecutor() for managed_node2/TASK: Assert that the port1 device is in DOWN state 13531 1726882443.58359: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000000c6 13531 1726882443.58370: variable 'ansible_search_path' from source: unknown 13531 1726882443.58410: calling self._execute() 13531 1726882443.58514: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882443.58518: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882443.58528: variable 'omit' from source: magic vars 13531 1726882443.59201: variable 'ansible_distribution_major_version' from source: facts 13531 1726882443.59209: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882443.59669: variable 'network_provider' from source: set_fact 13531 1726882443.59673: Evaluated conditional (network_provider == "initscripts"): False 13531 1726882443.59675: when evaluation is False, skipping this task 13531 1726882443.59677: _execute() done 13531 1726882443.59679: dumping result to json 13531 1726882443.59680: done dumping result, returning 13531 1726882443.59682: done running TaskExecutor() for managed_node2/TASK: Assert that the port1 device is in DOWN state [0e448fcc-3ce9-4fd9-519d-0000000000c6] 13531 1726882443.59684: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000c6 13531 1726882443.59748: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000c6 13531 1726882443.59751: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 13531 1726882443.59792: no more pending results, returning what we have 13531 1726882443.59795: results queue empty 13531 1726882443.59796: checking for any_errors_fatal 13531 1726882443.59801: done checking for any_errors_fatal 13531 1726882443.59802: checking for max_fail_percentage 13531 1726882443.59804: done 
checking for max_fail_percentage 13531 1726882443.59805: checking to see if all hosts have failed and the running result is not ok 13531 1726882443.59805: done checking to see if all hosts have failed 13531 1726882443.59806: getting the remaining hosts for this loop 13531 1726882443.59807: done getting the remaining hosts for this loop 13531 1726882443.59811: getting the next task for host managed_node2 13531 1726882443.59816: done getting next task for host managed_node2 13531 1726882443.59819: ^ task is: TASK: Assert that the port2 device is in DOWN state 13531 1726882443.59822: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882443.59825: getting variables 13531 1726882443.59827: in VariableManager get_vars() 13531 1726882443.59880: Calling all_inventory to load vars for managed_node2 13531 1726882443.59882: Calling groups_inventory to load vars for managed_node2 13531 1726882443.59885: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882443.59895: Calling all_plugins_play to load vars for managed_node2 13531 1726882443.59897: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882443.59900: Calling groups_plugins_play to load vars for managed_node2 13531 1726882443.74505: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882443.76260: done with get_vars() 13531 1726882443.76297: done getting variables 13531 1726882443.76353: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) TASK [Assert that the port2 device is in DOWN state] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:153 Friday 20 September 2024 21:34:03 -0400 (0:00:00.207) 0:00:31.659 ****** 13531 1726882443.76384: entering _queue_task() for managed_node2/assert 13531 1726882443.76712: worker is 1 (out of 1 available) 13531 1726882443.76724: exiting _queue_task() for managed_node2/assert 13531 1726882443.76738: done queuing things up, now waiting for results queue to drain 13531 1726882443.76739: waiting for pending results... 13531 1726882443.77030: running TaskExecutor() for managed_node2/TASK: Assert that the port2 device is in DOWN state 13531 1726882443.77151: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000000c7 13531 1726882443.77180: variable 'ansible_search_path' from source: unknown 13531 1726882443.77231: calling self._execute() 13531 1726882443.77347: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882443.77366: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882443.77381: variable 'omit' from source: magic vars 13531 1726882443.77816: variable 'ansible_distribution_major_version' from source: facts 13531 1726882443.77840: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882443.77967: variable 'network_provider' from source: set_fact 13531 1726882443.77978: Evaluated conditional (network_provider == "initscripts"): False 13531 1726882443.77985: when evaluation is False, skipping this task 13531 1726882443.77992: _execute() done 13531 1726882443.77998: dumping result to json 13531 1726882443.78004: done dumping result, returning 13531 1726882443.78015: done running TaskExecutor() for managed_node2/TASK: Assert that the port2 device is in DOWN state [0e448fcc-3ce9-4fd9-519d-0000000000c7] 13531 1726882443.78024: sending task 
result for task 0e448fcc-3ce9-4fd9-519d-0000000000c7 13531 1726882443.78151: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000c7 skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 13531 1726882443.78208: no more pending results, returning what we have 13531 1726882443.78212: results queue empty 13531 1726882443.78213: checking for any_errors_fatal 13531 1726882443.78226: done checking for any_errors_fatal 13531 1726882443.78227: checking for max_fail_percentage 13531 1726882443.78229: done checking for max_fail_percentage 13531 1726882443.78230: checking to see if all hosts have failed and the running result is not ok 13531 1726882443.78231: done checking to see if all hosts have failed 13531 1726882443.78231: getting the remaining hosts for this loop 13531 1726882443.78233: done getting the remaining hosts for this loop 13531 1726882443.78237: getting the next task for host managed_node2 13531 1726882443.78245: done getting next task for host managed_node2 13531 1726882443.78251: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13531 1726882443.78257: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882443.78282: getting variables 13531 1726882443.78284: in VariableManager get_vars() 13531 1726882443.78344: Calling all_inventory to load vars for managed_node2 13531 1726882443.78347: Calling groups_inventory to load vars for managed_node2 13531 1726882443.78349: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882443.78366: Calling all_plugins_play to load vars for managed_node2 13531 1726882443.78370: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882443.78373: Calling groups_plugins_play to load vars for managed_node2 13531 1726882443.79588: WORKER PROCESS EXITING 13531 1726882443.80101: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882443.83449: done with get_vars() 13531 1726882443.83491: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:34:03 -0400 (0:00:00.072) 0:00:31.731 ****** 13531 1726882443.83606: entering _queue_task() for managed_node2/include_tasks 13531 1726882443.83943: worker is 1 (out of 1 available) 13531 1726882443.83957: exiting _queue_task() for managed_node2/include_tasks 13531 1726882443.83971: done queuing things up, now waiting for results queue to drain 13531 1726882443.83973: waiting for pending results... 
13531 1726882443.84289: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13531 1726882443.84465: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000000cf 13531 1726882443.84486: variable 'ansible_search_path' from source: unknown 13531 1726882443.84494: variable 'ansible_search_path' from source: unknown 13531 1726882443.84538: calling self._execute() 13531 1726882443.84646: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882443.84659: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882443.84678: variable 'omit' from source: magic vars 13531 1726882443.85075: variable 'ansible_distribution_major_version' from source: facts 13531 1726882443.85092: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882443.85102: _execute() done 13531 1726882443.85109: dumping result to json 13531 1726882443.85116: done dumping result, returning 13531 1726882443.85126: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-4fd9-519d-0000000000cf] 13531 1726882443.85137: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000cf 13531 1726882443.85289: no more pending results, returning what we have 13531 1726882443.85294: in VariableManager get_vars() 13531 1726882443.85360: Calling all_inventory to load vars for managed_node2 13531 1726882443.85365: Calling groups_inventory to load vars for managed_node2 13531 1726882443.85367: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882443.85381: Calling all_plugins_play to load vars for managed_node2 13531 1726882443.85384: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882443.85387: Calling groups_plugins_play to load vars for managed_node2 13531 1726882443.86568: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000cf 13531 
1726882443.86572: WORKER PROCESS EXITING 13531 1726882443.87326: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882443.89029: done with get_vars() 13531 1726882443.89052: variable 'ansible_search_path' from source: unknown 13531 1726882443.89056: variable 'ansible_search_path' from source: unknown 13531 1726882443.89101: we have included files to process 13531 1726882443.89102: generating all_blocks data 13531 1726882443.89106: done generating all_blocks data 13531 1726882443.89113: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13531 1726882443.89114: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13531 1726882443.89116: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13531 1726882443.90428: done processing included file 13531 1726882443.90431: iterating over new_blocks loaded from include file 13531 1726882443.90432: in VariableManager get_vars() 13531 1726882443.90473: done with get_vars() 13531 1726882443.90475: filtering new block on tags 13531 1726882443.90496: done filtering new block on tags 13531 1726882443.90499: in VariableManager get_vars() 13531 1726882443.90532: done with get_vars() 13531 1726882443.90534: filtering new block on tags 13531 1726882443.90558: done filtering new block on tags 13531 1726882443.90561: in VariableManager get_vars() 13531 1726882443.90596: done with get_vars() 13531 1726882443.90598: filtering new block on tags 13531 1726882443.90616: done filtering new block on tags 13531 1726882443.90618: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 13531 1726882443.90623: extending task lists for all hosts 
with included blocks 13531 1726882443.92366: done extending task lists 13531 1726882443.92368: done processing included files 13531 1726882443.92369: results queue empty 13531 1726882443.92369: checking for any_errors_fatal 13531 1726882443.92373: done checking for any_errors_fatal 13531 1726882443.92374: checking for max_fail_percentage 13531 1726882443.92375: done checking for max_fail_percentage 13531 1726882443.92376: checking to see if all hosts have failed and the running result is not ok 13531 1726882443.92377: done checking to see if all hosts have failed 13531 1726882443.92377: getting the remaining hosts for this loop 13531 1726882443.92379: done getting the remaining hosts for this loop 13531 1726882443.92382: getting the next task for host managed_node2 13531 1726882443.92386: done getting next task for host managed_node2 13531 1726882443.92389: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 13531 1726882443.92392: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882443.92405: getting variables 13531 1726882443.92406: in VariableManager get_vars() 13531 1726882443.92432: Calling all_inventory to load vars for managed_node2 13531 1726882443.92434: Calling groups_inventory to load vars for managed_node2 13531 1726882443.92436: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882443.92442: Calling all_plugins_play to load vars for managed_node2 13531 1726882443.92445: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882443.92448: Calling groups_plugins_play to load vars for managed_node2 13531 1726882443.93816: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882443.95582: done with get_vars() 13531 1726882443.95611: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:34:03 -0400 (0:00:00.120) 0:00:31.852 ****** 13531 1726882443.95696: entering _queue_task() for managed_node2/setup 13531 1726882443.96036: worker is 1 (out of 1 available) 13531 1726882443.96048: exiting _queue_task() for managed_node2/setup 13531 1726882443.96062: done queuing things up, now waiting for results queue to drain 13531 1726882443.96064: waiting for pending results... 
13531 1726882443.96347: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 13531 1726882443.96517: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000796 13531 1726882443.96536: variable 'ansible_search_path' from source: unknown 13531 1726882443.96542: variable 'ansible_search_path' from source: unknown 13531 1726882443.96587: calling self._execute() 13531 1726882443.96691: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882443.96701: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882443.96713: variable 'omit' from source: magic vars 13531 1726882443.97102: variable 'ansible_distribution_major_version' from source: facts 13531 1726882443.97119: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882443.97345: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13531 1726882443.99939: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13531 1726882444.00146: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13531 1726882444.00194: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13531 1726882444.00239: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13531 1726882444.00343: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13531 1726882444.00494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882444.00566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882444.00673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882444.00720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882444.00878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882444.00935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882444.01078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882444.01108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882444.01156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882444.01183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882444.01477: variable '__network_required_facts' from source: role 
'' defaults 13531 1726882444.01581: variable 'ansible_facts' from source: unknown 13531 1726882444.03220: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 13531 1726882444.03356: when evaluation is False, skipping this task 13531 1726882444.03367: _execute() done 13531 1726882444.03374: dumping result to json 13531 1726882444.03382: done dumping result, returning 13531 1726882444.03394: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-4fd9-519d-000000000796] 13531 1726882444.03403: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000796 skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13531 1726882444.03551: no more pending results, returning what we have 13531 1726882444.03558: results queue empty 13531 1726882444.03559: checking for any_errors_fatal 13531 1726882444.03561: done checking for any_errors_fatal 13531 1726882444.03561: checking for max_fail_percentage 13531 1726882444.03565: done checking for max_fail_percentage 13531 1726882444.03566: checking to see if all hosts have failed and the running result is not ok 13531 1726882444.03567: done checking to see if all hosts have failed 13531 1726882444.03568: getting the remaining hosts for this loop 13531 1726882444.03569: done getting the remaining hosts for this loop 13531 1726882444.03573: getting the next task for host managed_node2 13531 1726882444.03583: done getting next task for host managed_node2 13531 1726882444.03587: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 13531 1726882444.03591: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882444.03613: getting variables 13531 1726882444.03615: in VariableManager get_vars() 13531 1726882444.03680: Calling all_inventory to load vars for managed_node2 13531 1726882444.03683: Calling groups_inventory to load vars for managed_node2 13531 1726882444.03686: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882444.03698: Calling all_plugins_play to load vars for managed_node2 13531 1726882444.03701: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882444.03704: Calling groups_plugins_play to load vars for managed_node2 13531 1726882444.04730: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000796 13531 1726882444.04734: WORKER PROCESS EXITING 13531 1726882444.05640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882444.07528: done with get_vars() 13531 1726882444.07562: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:34:04 -0400 (0:00:00.119) 0:00:31.971 ****** 13531 1726882444.07673: entering _queue_task() for managed_node2/stat 13531 1726882444.07998: worker is 
1 (out of 1 available) 13531 1726882444.08010: exiting _queue_task() for managed_node2/stat 13531 1726882444.08023: done queuing things up, now waiting for results queue to drain 13531 1726882444.08024: waiting for pending results... 13531 1726882444.08511: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 13531 1726882444.08687: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000798 13531 1726882444.08705: variable 'ansible_search_path' from source: unknown 13531 1726882444.08712: variable 'ansible_search_path' from source: unknown 13531 1726882444.08762: calling self._execute() 13531 1726882444.08888: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882444.08898: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882444.08913: variable 'omit' from source: magic vars 13531 1726882444.09317: variable 'ansible_distribution_major_version' from source: facts 13531 1726882444.09334: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882444.09510: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13531 1726882444.09792: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13531 1726882444.09865: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13531 1726882444.09902: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13531 1726882444.09966: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13531 1726882444.10117: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13531 1726882444.10145: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13531 1726882444.10184: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882444.10248: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13531 1726882444.10376: variable '__network_is_ostree' from source: set_fact 13531 1726882444.10387: Evaluated conditional (not __network_is_ostree is defined): False 13531 1726882444.10394: when evaluation is False, skipping this task 13531 1726882444.10400: _execute() done 13531 1726882444.10407: dumping result to json 13531 1726882444.10413: done dumping result, returning 13531 1726882444.10423: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-4fd9-519d-000000000798] 13531 1726882444.10433: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000798 skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 13531 1726882444.10589: no more pending results, returning what we have 13531 1726882444.10593: results queue empty 13531 1726882444.10594: checking for any_errors_fatal 13531 1726882444.10601: done checking for any_errors_fatal 13531 1726882444.10602: checking for max_fail_percentage 13531 1726882444.10604: done checking for max_fail_percentage 13531 1726882444.10605: checking to see if all hosts have failed and the running result is not ok 13531 1726882444.10606: done checking to see if all hosts have failed 13531 1726882444.10606: getting the remaining hosts for this loop 13531 
1726882444.10608: done getting the remaining hosts for this loop 13531 1726882444.10611: getting the next task for host managed_node2 13531 1726882444.10618: done getting next task for host managed_node2 13531 1726882444.10622: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 13531 1726882444.10626: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882444.10647: getting variables 13531 1726882444.10649: in VariableManager get_vars() 13531 1726882444.10714: Calling all_inventory to load vars for managed_node2 13531 1726882444.10716: Calling groups_inventory to load vars for managed_node2 13531 1726882444.10719: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882444.10731: Calling all_plugins_play to load vars for managed_node2 13531 1726882444.10734: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882444.10737: Calling groups_plugins_play to load vars for managed_node2 13531 1726882444.11684: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000798 13531 1726882444.11688: WORKER PROCESS EXITING 13531 1726882444.12841: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882444.14893: done with get_vars() 13531 1726882444.14930: done getting variables 13531 1726882444.15117: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:34:04 -0400 (0:00:00.074) 0:00:32.046 ****** 13531 1726882444.15159: entering _queue_task() for managed_node2/set_fact 13531 1726882444.15993: worker is 1 (out of 1 available) 13531 1726882444.16006: exiting _queue_task() for managed_node2/set_fact 13531 1726882444.16019: done queuing things up, now waiting for results queue to drain 13531 1726882444.16020: waiting for pending results... 
13531 1726882444.16501: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 13531 1726882444.16652: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000799 13531 1726882444.16668: variable 'ansible_search_path' from source: unknown 13531 1726882444.16672: variable 'ansible_search_path' from source: unknown 13531 1726882444.16716: calling self._execute() 13531 1726882444.16821: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882444.16825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882444.16834: variable 'omit' from source: magic vars 13531 1726882444.17221: variable 'ansible_distribution_major_version' from source: facts 13531 1726882444.17242: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882444.17419: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13531 1726882444.17709: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13531 1726882444.17752: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13531 1726882444.17796: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13531 1726882444.17831: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13531 1726882444.17999: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13531 1726882444.18035: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13531 1726882444.18065: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882444.18122: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13531 1726882444.18243: variable '__network_is_ostree' from source: set_fact 13531 1726882444.18249: Evaluated conditional (not __network_is_ostree is defined): False 13531 1726882444.18253: when evaluation is False, skipping this task 13531 1726882444.18258: _execute() done 13531 1726882444.18260: dumping result to json 13531 1726882444.18265: done dumping result, returning 13531 1726882444.18273: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-4fd9-519d-000000000799] 13531 1726882444.18298: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000799 13531 1726882444.18401: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000799 13531 1726882444.18403: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 13531 1726882444.19220: no more pending results, returning what we have 13531 1726882444.19224: results queue empty 13531 1726882444.19225: checking for any_errors_fatal 13531 1726882444.19233: done checking for any_errors_fatal 13531 1726882444.19234: checking for max_fail_percentage 13531 1726882444.19236: done checking for max_fail_percentage 13531 1726882444.19237: checking to see if all hosts have failed and the running result is not ok 13531 1726882444.19238: done checking to see if all hosts have failed 13531 1726882444.19239: getting the remaining hosts for this loop 13531 1726882444.19240: done getting the remaining hosts for this loop 
13531 1726882444.19244: getting the next task for host managed_node2 13531 1726882444.19254: done getting next task for host managed_node2 13531 1726882444.19258: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 13531 1726882444.19263: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882444.19290: getting variables 13531 1726882444.19292: in VariableManager get_vars() 13531 1726882444.19352: Calling all_inventory to load vars for managed_node2 13531 1726882444.19355: Calling groups_inventory to load vars for managed_node2 13531 1726882444.19358: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882444.19377: Calling all_plugins_play to load vars for managed_node2 13531 1726882444.19380: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882444.19384: Calling groups_plugins_play to load vars for managed_node2 13531 1726882444.21069: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882444.23215: done with get_vars() 13531 1726882444.23240: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:34:04 -0400 (0:00:00.081) 0:00:32.128 ****** 13531 1726882444.23349: entering _queue_task() for managed_node2/service_facts 13531 1726882444.23707: worker is 1 (out of 1 available) 13531 1726882444.23720: exiting _queue_task() for managed_node2/service_facts 13531 1726882444.23736: done queuing things up, now waiting for results queue to drain 13531 1726882444.23738: waiting for pending results... 
13531 1726882444.24049: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 13531 1726882444.24306: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000079b 13531 1726882444.24312: variable 'ansible_search_path' from source: unknown 13531 1726882444.24316: variable 'ansible_search_path' from source: unknown 13531 1726882444.24329: calling self._execute() 13531 1726882444.24443: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882444.24447: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882444.24459: variable 'omit' from source: magic vars 13531 1726882444.24888: variable 'ansible_distribution_major_version' from source: facts 13531 1726882444.24900: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882444.24907: variable 'omit' from source: magic vars 13531 1726882444.24993: variable 'omit' from source: magic vars 13531 1726882444.25027: variable 'omit' from source: magic vars 13531 1726882444.25073: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882444.25115: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882444.25136: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882444.25153: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882444.25167: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882444.25205: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882444.25208: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882444.25211: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 13531 1726882444.25321: Set connection var ansible_pipelining to False 13531 1726882444.25326: Set connection var ansible_timeout to 10 13531 1726882444.25332: Set connection var ansible_shell_executable to /bin/sh 13531 1726882444.25337: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882444.25340: Set connection var ansible_connection to ssh 13531 1726882444.25342: Set connection var ansible_shell_type to sh 13531 1726882444.25371: variable 'ansible_shell_executable' from source: unknown 13531 1726882444.25374: variable 'ansible_connection' from source: unknown 13531 1726882444.25377: variable 'ansible_module_compression' from source: unknown 13531 1726882444.25380: variable 'ansible_shell_type' from source: unknown 13531 1726882444.25382: variable 'ansible_shell_executable' from source: unknown 13531 1726882444.25384: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882444.25386: variable 'ansible_pipelining' from source: unknown 13531 1726882444.25389: variable 'ansible_timeout' from source: unknown 13531 1726882444.25394: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882444.25603: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13531 1726882444.25613: variable 'omit' from source: magic vars 13531 1726882444.25619: starting attempt loop 13531 1726882444.25629: running the handler 13531 1726882444.25643: _low_level_execute_command(): starting 13531 1726882444.25650: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13531 1726882444.26482: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882444.26494: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 13531 1726882444.26504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882444.26521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882444.26605: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882444.26611: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882444.26621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882444.26639: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882444.26650: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882444.26667: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882444.26702: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882444.26707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882444.26725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882444.26735: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882444.26747: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882444.26759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882444.26841: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882444.26849: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882444.26867: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882444.27006: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
13531 1726882444.28709: stdout chunk (state=3): >>>/root <<< 13531 1726882444.28851: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882444.28858: stdout chunk (state=3): >>><<< 13531 1726882444.28865: stderr chunk (state=3): >>><<< 13531 1726882444.28889: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882444.28904: _low_level_execute_command(): starting 13531 1726882444.28910: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882444.2888918-14935-71941650372640 `" && echo ansible-tmp-1726882444.2888918-14935-71941650372640="` echo /root/.ansible/tmp/ansible-tmp-1726882444.2888918-14935-71941650372640 `" ) && sleep 0' 13531 1726882444.30021: stderr chunk (state=2): >>>OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882444.30276: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882444.30282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882444.30296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882444.30337: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882444.30372: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882444.30382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882444.30395: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882444.30403: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882444.30409: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882444.30417: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882444.30426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882444.30438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882444.30445: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882444.30456: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882444.30461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882444.30541: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882444.30565: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882444.30571: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 13531 1726882444.30712: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882444.32598: stdout chunk (state=3): >>>ansible-tmp-1726882444.2888918-14935-71941650372640=/root/.ansible/tmp/ansible-tmp-1726882444.2888918-14935-71941650372640 <<< 13531 1726882444.32798: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882444.32802: stdout chunk (state=3): >>><<< 13531 1726882444.32804: stderr chunk (state=3): >>><<< 13531 1726882444.33139: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882444.2888918-14935-71941650372640=/root/.ansible/tmp/ansible-tmp-1726882444.2888918-14935-71941650372640 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882444.33143: variable 'ansible_module_compression' from source: unknown 13531 1726882444.33146: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-13531i_snf1g8/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 13531 1726882444.33149: variable 'ansible_facts' from source: unknown 13531 1726882444.33153: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882444.2888918-14935-71941650372640/AnsiballZ_service_facts.py 13531 1726882444.33225: Sending initial data 13531 1726882444.33229: Sent initial data (161 bytes) 13531 1726882444.33887: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882444.33891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882444.33926: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882444.33931: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882444.33940: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882444.33945: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882444.33952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882444.33968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882444.33979: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882444.34034: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882444.34038: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882444.34144: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882444.35896: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13531 1726882444.35994: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13531 1726882444.36091: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13531i_snf1g8/tmpuoeawt59 /root/.ansible/tmp/ansible-tmp-1726882444.2888918-14935-71941650372640/AnsiballZ_service_facts.py <<< 13531 1726882444.36202: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13531 1726882444.37420: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882444.37536: stderr chunk (state=3): >>><<< 13531 1726882444.37540: stdout chunk (state=3): >>><<< 13531 1726882444.37547: done transferring module to remote 13531 1726882444.37559: _low_level_execute_command(): starting 13531 1726882444.37561: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882444.2888918-14935-71941650372640/ /root/.ansible/tmp/ansible-tmp-1726882444.2888918-14935-71941650372640/AnsiballZ_service_facts.py && sleep 0' 13531 1726882444.38018: stderr chunk (state=2): >>>OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882444.38028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882444.38053: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882444.38059: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882444.38073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882444.38087: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882444.38099: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 13531 1726882444.38105: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882444.38113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882444.38118: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 13531 1726882444.38123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882444.38173: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882444.38194: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882444.38197: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882444.38313: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882444.40171: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882444.40269: stderr chunk (state=3): 
>>><<< 13531 1726882444.40273: stdout chunk (state=3): >>><<< 13531 1726882444.40276: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882444.40278: _low_level_execute_command(): starting 13531 1726882444.40280: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882444.2888918-14935-71941650372640/AnsiballZ_service_facts.py && sleep 0' 13531 1726882444.41040: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882444.41057: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882444.41078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882444.41103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 
1726882444.41145: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882444.41159: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882444.41178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882444.41200: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882444.41215: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882444.41242: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882444.41254: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882444.41271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882444.41285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882444.41296: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882444.41307: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882444.41320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882444.41521: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882444.41538: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882444.41552: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882444.41699: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882445.78534: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": 
"auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", <<< 13531 1726882445.78626: stdout chunk (state=3): >>>"source": "systemd"}, "dracut-pre-trigger.service": {"name": 
"dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": 
{"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "ina<<< 13531 1726882445.78640: stdout chunk (state=3): >>>ctive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": 
{"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 13531 1726882445.79892: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 
closed. <<< 13531 1726882445.79896: stderr chunk (state=3): >>><<< 13531 1726882445.79898: stdout chunk (state=3): >>><<< 13531 1726882445.79935: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": 
"nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": 
"serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": 
"systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": 
"rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", 
"state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
13531 1726882445.80810: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882444.2888918-14935-71941650372640/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13531 1726882445.80814: _low_level_execute_command(): starting 13531 1726882445.80817: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882444.2888918-14935-71941650372640/ > /dev/null 2>&1 && sleep 0' 13531 1726882445.81477: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882445.81481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882445.81524: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882445.81528: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882445.81530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882445.81579: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882445.81587: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882445.81597: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882445.81705: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882445.84172: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882445.84176: stderr chunk (state=3): >>><<< 13531 1726882445.84179: stdout chunk (state=3): >>><<< 13531 1726882445.84183: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status 
from master 0 13531 1726882445.84185: handler run complete 13531 1726882445.84187: variable 'ansible_facts' from source: unknown 13531 1726882445.84321: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882445.85047: variable 'ansible_facts' from source: unknown 13531 1726882445.85197: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882445.85428: attempt loop complete, returning result 13531 1726882445.85439: _execute() done 13531 1726882445.85446: dumping result to json 13531 1726882445.85510: done dumping result, returning 13531 1726882445.85534: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-4fd9-519d-00000000079b] 13531 1726882445.85544: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000079b ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13531 1726882445.86635: no more pending results, returning what we have 13531 1726882445.86637: results queue empty 13531 1726882445.86638: checking for any_errors_fatal 13531 1726882445.86643: done checking for any_errors_fatal 13531 1726882445.86645: checking for max_fail_percentage 13531 1726882445.86647: done checking for max_fail_percentage 13531 1726882445.86648: checking to see if all hosts have failed and the running result is not ok 13531 1726882445.86649: done checking to see if all hosts have failed 13531 1726882445.86650: getting the remaining hosts for this loop 13531 1726882445.86651: done getting the remaining hosts for this loop 13531 1726882445.86654: getting the next task for host managed_node2 13531 1726882445.86659: done getting next task for host managed_node2 13531 1726882445.86662: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 
13531 1726882445.86667: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882445.86675: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000079b 13531 1726882445.86678: WORKER PROCESS EXITING 13531 1726882445.86685: getting variables 13531 1726882445.86686: in VariableManager get_vars() 13531 1726882445.86720: Calling all_inventory to load vars for managed_node2 13531 1726882445.86721: Calling groups_inventory to load vars for managed_node2 13531 1726882445.86723: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882445.86730: Calling all_plugins_play to load vars for managed_node2 13531 1726882445.86731: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882445.86733: Calling groups_plugins_play to load vars for managed_node2 13531 1726882445.87592: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882445.88977: done with get_vars() 13531 1726882445.89001: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:34:05 -0400 (0:00:01.657) 0:00:33.786 ****** 13531 1726882445.89104: entering _queue_task() for managed_node2/package_facts 13531 1726882445.89818: worker is 1 (out of 1 available) 13531 1726882445.89829: exiting _queue_task() for managed_node2/package_facts 13531 1726882445.89843: done queuing things up, now waiting for results queue to drain 13531 1726882445.89844: waiting for pending results... 13531 1726882445.90170: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 13531 1726882445.90313: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000079c 13531 1726882445.90326: variable 'ansible_search_path' from source: unknown 13531 1726882445.90329: variable 'ansible_search_path' from source: unknown 13531 1726882445.90361: calling self._execute() 13531 1726882445.90444: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882445.90448: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882445.90456: variable 'omit' from source: magic vars 13531 1726882445.90741: variable 'ansible_distribution_major_version' from source: facts 13531 1726882445.90752: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882445.90760: variable 'omit' from source: magic vars 13531 1726882445.90811: variable 'omit' from source: magic vars 13531 1726882445.90836: variable 'omit' from source: magic vars 13531 1726882445.90877: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882445.90901: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882445.90917: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882445.90930: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882445.90943: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882445.90969: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882445.90972: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882445.90975: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882445.91045: Set connection var ansible_pipelining to False 13531 1726882445.91048: Set connection var ansible_timeout to 10 13531 1726882445.91059: Set connection var ansible_shell_executable to /bin/sh 13531 1726882445.91062: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882445.91066: Set connection var ansible_connection to ssh 13531 1726882445.91069: Set connection var ansible_shell_type to sh 13531 1726882445.91088: variable 'ansible_shell_executable' from source: unknown 13531 1726882445.91090: variable 'ansible_connection' from source: unknown 13531 1726882445.91093: variable 'ansible_module_compression' from source: unknown 13531 1726882445.91095: variable 'ansible_shell_type' from source: unknown 13531 1726882445.91097: variable 'ansible_shell_executable' from source: unknown 13531 1726882445.91101: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882445.91105: variable 'ansible_pipelining' from source: unknown 13531 1726882445.91107: variable 'ansible_timeout' from source: unknown 13531 1726882445.91111: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882445.91255: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13531 1726882445.91271: variable 'omit' from source: magic vars 13531 1726882445.91275: starting attempt loop 13531 1726882445.91278: running the handler 13531 1726882445.91288: _low_level_execute_command(): starting 13531 1726882445.91294: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13531 1726882445.91793: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882445.91802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882445.91831: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882445.91846: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882445.91857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882445.91905: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882445.91911: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882445.91922: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882445.92046: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 13531 1726882445.93695: stdout chunk (state=3): >>>/root <<< 13531 1726882445.93801: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882445.93843: stderr chunk (state=3): >>><<< 13531 1726882445.93853: stdout chunk (state=3): >>><<< 13531 1726882445.93882: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882445.93891: _low_level_execute_command(): starting 13531 1726882445.93897: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882445.9388053-15019-279113321539733 `" && echo ansible-tmp-1726882445.9388053-15019-279113321539733="` echo /root/.ansible/tmp/ansible-tmp-1726882445.9388053-15019-279113321539733 `" ) && sleep 0' 13531 
1726882445.94332: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882445.94337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882445.94379: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882445.94402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882445.94436: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882445.94449: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882445.94555: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882445.96492: stdout chunk (state=3): >>>ansible-tmp-1726882445.9388053-15019-279113321539733=/root/.ansible/tmp/ansible-tmp-1726882445.9388053-15019-279113321539733 <<< 13531 1726882445.96594: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882445.96647: stderr chunk (state=3): >>><<< 13531 1726882445.96650: stdout chunk (state=3): >>><<< 13531 1726882445.96670: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882445.9388053-15019-279113321539733=/root/.ansible/tmp/ansible-tmp-1726882445.9388053-15019-279113321539733 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882445.96707: variable 'ansible_module_compression' from source: unknown 13531 1726882445.96746: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13531i_snf1g8/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 13531 1726882445.96798: variable 'ansible_facts' from source: unknown 13531 1726882445.96931: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882445.9388053-15019-279113321539733/AnsiballZ_package_facts.py 13531 1726882445.97049: Sending initial data 13531 1726882445.97053: Sent initial data (162 bytes) 13531 1726882445.97730: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 
1726882445.97739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882445.97773: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882445.97777: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882445.97789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882445.97799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882445.97848: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882445.97861: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882445.97977: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882445.99735: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: 
Server supports extension "expand-path@openssh.com" revision 1 <<< 13531 1726882445.99828: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13531 1726882445.99927: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13531i_snf1g8/tmpe5dhqzs8 /root/.ansible/tmp/ansible-tmp-1726882445.9388053-15019-279113321539733/AnsiballZ_package_facts.py <<< 13531 1726882446.00021: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13531 1726882446.02013: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882446.02124: stderr chunk (state=3): >>><<< 13531 1726882446.02128: stdout chunk (state=3): >>><<< 13531 1726882446.02143: done transferring module to remote 13531 1726882446.02154: _low_level_execute_command(): starting 13531 1726882446.02162: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882445.9388053-15019-279113321539733/ /root/.ansible/tmp/ansible-tmp-1726882445.9388053-15019-279113321539733/AnsiballZ_package_facts.py && sleep 0' 13531 1726882446.02631: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882446.02635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882446.02673: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882446.02686: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882446.02738: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882446.02750: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882446.02855: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882446.04638: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882446.04693: stderr chunk (state=3): >>><<< 13531 1726882446.04696: stdout chunk (state=3): >>><<< 13531 1726882446.04709: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882446.04718: _low_level_execute_command(): starting 13531 1726882446.04720: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882445.9388053-15019-279113321539733/AnsiballZ_package_facts.py && sleep 0' 13531 1726882446.05153: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882446.05163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882446.05192: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882446.05205: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882446.05216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882446.05322: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882446.05325: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882446.05424: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882446.51744: stdout chunk (state=3): >>> {"ansible_facts": {"packages": 
{"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": 
[{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_<<< 13531 1726882446.51771: stdout chunk (state=3): >>>64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x8<<< 13531 1726882446.51787: stdout chunk (state=3): >>>6_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": 
"2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": 
"libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": 
[{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": 
"8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", 
"release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": 
"boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit"<<< 13531 1726882446.51905: stdout chunk (state=3): >>>, "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket"<<< 13531 1726882446.51920: stdout chunk (state=3): >>>, "version": "2.031", "release": "4.el9", 
"epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", 
"version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": 
[{"name": "aspell", "version": "0.60.8", "release": "8.el9"<<< 13531 1726882446.51935: stdout chunk (state=3): >>>, "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch"<<< 13531 1726882446.51938: stdout chunk (state=3): >>>: null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release"<<< 13531 1726882446.51945: stdout chunk (state=3): >>>: "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", 
"release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": 
"hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 13531 1726882446.53477: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 13531 1726882446.53531: stderr chunk (state=3): >>><<< 13531 1726882446.53534: stdout chunk (state=3): >>><<< 13531 1726882446.53578: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": 
"centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": 
"8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": 
"1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": 
"2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": 
[{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": 
"0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", 
"release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": 
"4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": 
"dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", 
"version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", 
"release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": 
"146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", 
"release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", 
"release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", 
"source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": 
"5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": 
"0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": 
[{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": 
[{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
13531 1726882446.55029: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882445.9388053-15019-279113321539733/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
13531 1726882446.55045: _low_level_execute_command(): starting
13531 1726882446.55049: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882445.9388053-15019-279113321539733/ > /dev/null 2>&1 && sleep 0'
13531 1726882446.55544: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
13531 1726882446.55548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
13531 1726882446.55581: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13531 1726882446.55593: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13531 1726882446.55642: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
13531 1726882446.55657: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
13531 1726882446.55770: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13531 1726882446.57716: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13531 1726882446.57728: stdout chunk (state=3): >>><<<
13531 1726882446.57741: stderr chunk (state=3): >>><<<
13531 1726882446.57772: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
13531 1726882446.57784: handler run complete
13531 1726882446.58837: variable 'ansible_facts' from source: unknown
13531 1726882446.59232: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13531 1726882446.60456: variable 'ansible_facts' from source: unknown
13531 1726882446.60860: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13531 1726882446.61670: attempt loop complete, returning result
13531 1726882446.61674: _execute() done
13531 1726882446.61676: dumping result to json
13531 1726882446.61820: done dumping result, returning
13531 1726882446.61833: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-4fd9-519d-00000000079c]
13531 1726882446.61835: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000079c
13531 1726882446.64005: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000079c
ok: [managed_node2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
13531 1726882446.64239: no more pending results, returning what we have
13531 1726882446.64242: results queue empty
13531 1726882446.64243: checking for any_errors_fatal
13531 1726882446.64252: done checking for any_errors_fatal
13531 1726882446.64253: checking for max_fail_percentage
13531 1726882446.64254: done checking for max_fail_percentage
13531 1726882446.64255: checking to see if all hosts have failed and the running result is not ok
13531 1726882446.64256: done checking to see if all hosts have failed
13531 1726882446.64257: getting the remaining hosts for this loop
13531 1726882446.64258: done getting the remaining hosts for this loop
13531 1726882446.64262: getting the next task for host managed_node2
13531 1726882446.64272: done getting next task for host managed_node2
13531 1726882446.64276: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider
13531 1726882446.64279: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13531 1726882446.64291: getting variables
13531 1726882446.64293: in VariableManager get_vars()
13531 1726882446.64347: Calling all_inventory to load vars for managed_node2
13531 1726882446.64349: Calling groups_inventory to load vars for managed_node2
13531 1726882446.64352: Calling all_plugins_inventory to load vars for managed_node2
13531 1726882446.64365: Calling all_plugins_play to load vars for managed_node2
13531 1726882446.64368: Calling groups_plugins_inventory to load vars for managed_node2
13531 1726882446.64371: Calling groups_plugins_play to load vars for managed_node2
13531 1726882446.65308: WORKER PROCESS EXITING
13531 1726882446.66062: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13531 1726882446.67867: done with get_vars()
13531 1726882446.67900: done getting variables
13531 1726882446.67968: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Print network provider] **************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7
Friday 20 September 2024 21:34:06 -0400
(0:00:00.789) 0:00:34.575 ******
13531 1726882446.68003: entering _queue_task() for managed_node2/debug
13531 1726882446.68351: worker is 1 (out of 1 available)
13531 1726882446.68371: exiting _queue_task() for managed_node2/debug
13531 1726882446.68384: done queuing things up, now waiting for results queue to drain
13531 1726882446.68385: waiting for pending results...
13531 1726882446.68682: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider
13531 1726882446.68831: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000000d0
13531 1726882446.68853: variable 'ansible_search_path' from source: unknown
13531 1726882446.68861: variable 'ansible_search_path' from source: unknown
13531 1726882446.68905: calling self._execute()
13531 1726882446.69013: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882446.69029: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882446.69045: variable 'omit' from source: magic vars
13531 1726882446.69436: variable 'ansible_distribution_major_version' from source: facts
13531 1726882446.69454: Evaluated conditional (ansible_distribution_major_version != '6'): True
13531 1726882446.69472: variable 'omit' from source: magic vars
13531 1726882446.69534: variable 'omit' from source: magic vars
13531 1726882446.69642: variable 'network_provider' from source: set_fact
13531 1726882446.69667: variable 'omit' from source: magic vars
13531 1726882446.69723: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
13531 1726882446.69760: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
13531 1726882446.69790: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
13531 1726882446.69815: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13531 1726882446.69829: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13531 1726882446.69865: variable 'inventory_hostname' from source: host vars for 'managed_node2'
13531 1726882446.69874: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882446.69881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882446.69991: Set connection var ansible_pipelining to False
13531 1726882446.70006: Set connection var ansible_timeout to 10
13531 1726882446.70019: Set connection var ansible_shell_executable to /bin/sh
13531 1726882446.70027: Set connection var ansible_module_compression to ZIP_DEFLATED
13531 1726882446.70033: Set connection var ansible_connection to ssh
13531 1726882446.70038: Set connection var ansible_shell_type to sh
13531 1726882446.70072: variable 'ansible_shell_executable' from source: unknown
13531 1726882446.70079: variable 'ansible_connection' from source: unknown
13531 1726882446.70085: variable 'ansible_module_compression' from source: unknown
13531 1726882446.70091: variable 'ansible_shell_type' from source: unknown
13531 1726882446.70096: variable 'ansible_shell_executable' from source: unknown
13531 1726882446.70101: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882446.70109: variable 'ansible_pipelining' from source: unknown
13531 1726882446.70119: variable 'ansible_timeout' from source: unknown
13531 1726882446.70128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882446.70270: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
13531 1726882446.70286: variable 'omit' from source: magic vars
13531 1726882446.70295: starting attempt loop
13531 1726882446.70301: running the handler
13531 1726882446.70356: handler run complete
13531 1726882446.70378: attempt loop complete, returning result
13531 1726882446.70384: _execute() done
13531 1726882446.70390: dumping result to json
13531 1726882446.70396: done dumping result, returning
13531 1726882446.70407: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-4fd9-519d-0000000000d0]
13531 1726882446.70417: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000d0
ok: [managed_node2] => {}

MSG:

Using network provider: nm
13531 1726882446.70581: no more pending results, returning what we have
13531 1726882446.70584: results queue empty
13531 1726882446.70585: checking for any_errors_fatal
13531 1726882446.70595: done checking for any_errors_fatal
13531 1726882446.70596: checking for max_fail_percentage
13531 1726882446.70598: done checking for max_fail_percentage
13531 1726882446.70599: checking to see if all hosts have failed and the running result is not ok
13531 1726882446.70600: done checking to see if all hosts have failed
13531 1726882446.70601: getting the remaining hosts for this loop
13531 1726882446.70602: done getting the remaining hosts for this loop
13531 1726882446.70606: getting the next task for host managed_node2
13531 1726882446.70612: done getting next task for host managed_node2
13531 1726882446.70617: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
13531 1726882446.70621: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13531 1726882446.70633: getting variables
13531 1726882446.70635: in VariableManager get_vars()
13531 1726882446.70702: Calling all_inventory to load vars for managed_node2
13531 1726882446.70705: Calling groups_inventory to load vars for managed_node2
13531 1726882446.70707: Calling all_plugins_inventory to load vars for managed_node2
13531 1726882446.70719: Calling all_plugins_play to load vars for managed_node2
13531 1726882446.70722: Calling groups_plugins_inventory to load vars for managed_node2
13531 1726882446.70725: Calling groups_plugins_play to load vars for managed_node2
13531 1726882446.71733: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000d0
13531 1726882446.71737: WORKER PROCESS EXITING
13531 1726882446.72549: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13531 1726882446.74366: done with get_vars()
13531 1726882446.74399: done getting variables
13531 1726882446.74468: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11
Friday 20 September 2024 21:34:06 -0400 (0:00:00.064)
0:00:34.640 ******
13531 1726882446.74503: entering _queue_task() for managed_node2/fail
13531 1726882446.74854: worker is 1 (out of 1 available)
13531 1726882446.74868: exiting _queue_task() for managed_node2/fail
13531 1726882446.74886: done queuing things up, now waiting for results queue to drain
13531 1726882446.74888: waiting for pending results...
13531 1726882446.75207: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
13531 1726882446.75384: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000000d1
13531 1726882446.75404: variable 'ansible_search_path' from source: unknown
13531 1726882446.75412: variable 'ansible_search_path' from source: unknown
13531 1726882446.75468: calling self._execute()
13531 1726882446.75590: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882446.75604: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882446.75617: variable 'omit' from source: magic vars
13531 1726882446.76033: variable 'ansible_distribution_major_version' from source: facts
13531 1726882446.76051: Evaluated conditional (ansible_distribution_major_version != '6'): True
13531 1726882446.76180: variable 'network_state' from source: role '' defaults
13531 1726882446.76200: Evaluated conditional (network_state != {}): False
13531 1726882446.76212: when evaluation is False, skipping this task
13531 1726882446.76219: _execute() done
13531 1726882446.76225: dumping result to json
13531 1726882446.76231: done dumping result, returning
13531 1726882446.76241: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-4fd9-519d-0000000000d1]
13531 1726882446.76252: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000d1
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
13531 1726882446.76410: no more pending results, returning what we have
13531 1726882446.76414: results queue empty
13531 1726882446.76415: checking for any_errors_fatal
13531 1726882446.76422: done checking for any_errors_fatal
13531 1726882446.76422: checking for max_fail_percentage
13531 1726882446.76424: done checking for max_fail_percentage
13531 1726882446.76425: checking to see if all hosts have failed and the running result is not ok
13531 1726882446.76427: done checking to see if all hosts have failed
13531 1726882446.76427: getting the remaining hosts for this loop
13531 1726882446.76429: done getting the remaining hosts for this loop
13531 1726882446.76432: getting the next task for host managed_node2
13531 1726882446.76440: done getting next task for host managed_node2
13531 1726882446.76444: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
13531 1726882446.76448: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13531 1726882446.76473: getting variables
13531 1726882446.76475: in VariableManager get_vars()
13531 1726882446.76536: Calling all_inventory to load vars for managed_node2
13531 1726882446.76539: Calling groups_inventory to load vars for managed_node2
13531 1726882446.76542: Calling all_plugins_inventory to load vars for managed_node2
13531 1726882446.76555: Calling all_plugins_play to load vars for managed_node2
13531 1726882446.76558: Calling groups_plugins_inventory to load vars for managed_node2
13531 1726882446.76561: Calling groups_plugins_play to load vars for managed_node2
13531 1726882446.77511: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000d1
13531 1726882446.77515: WORKER PROCESS EXITING
13531 1726882446.78526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13531 1726882446.80421: done with get_vars()
13531 1726882446.80462: done getting variables
13531 1726882446.80527: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18
Friday 20 September 2024 21:34:06 -0400 (0:00:00.060) 0:00:34.700 ******
13531 1726882446.80570: entering _queue_task() for managed_node2/fail
13531 1726882446.80935: worker is 1 (out of 1 available)
13531 1726882446.80948: exiting _queue_task() for managed_node2/fail
13531 1726882446.80962: done queuing things up, now waiting for results queue to drain
13531 1726882446.80964: waiting for pending results...
13531 1726882446.81271: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
13531 1726882446.81425: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000000d2
13531 1726882446.81449: variable 'ansible_search_path' from source: unknown
13531 1726882446.81457: variable 'ansible_search_path' from source: unknown
13531 1726882446.81505: calling self._execute()
13531 1726882446.81618: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882446.81632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882446.81650: variable 'omit' from source: magic vars
13531 1726882446.82023: variable 'ansible_distribution_major_version' from source: facts
13531 1726882446.82042: Evaluated conditional (ansible_distribution_major_version != '6'): True
13531 1726882446.82171: variable 'network_state' from source: role '' defaults
13531 1726882446.82187: Evaluated conditional (network_state != {}): False
13531 1726882446.82199: when evaluation is False, skipping this task
13531 1726882446.82205: _execute() done
13531 1726882446.82212: dumping result to json
13531 1726882446.82218: done dumping result, returning
13531 1726882446.82228: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-4fd9-519d-0000000000d2]
13531 1726882446.82239: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000d2
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
13531 1726882446.82390: no more pending results, returning what we have
13531 1726882446.82395: results queue empty
13531 1726882446.82396: checking for any_errors_fatal
13531 1726882446.82409: done checking for any_errors_fatal
13531 1726882446.82410: checking for max_fail_percentage
13531 1726882446.82412: done checking for max_fail_percentage
13531 1726882446.82413: checking to see if all hosts have failed and the running result is not ok
13531 1726882446.82414: done checking to see if all hosts have failed
13531 1726882446.82415: getting the remaining hosts for this loop
13531 1726882446.82416: done getting the remaining hosts for this loop
13531 1726882446.82420: getting the next task for host managed_node2
13531 1726882446.82427: done getting next task for host managed_node2
13531 1726882446.82432: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
13531 1726882446.82436: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13531 1726882446.82459: getting variables
13531 1726882446.82461: in VariableManager get_vars()
13531 1726882446.82525: Calling all_inventory to load vars for managed_node2
13531 1726882446.82529: Calling groups_inventory to load vars for managed_node2
13531 1726882446.82531: Calling all_plugins_inventory to load vars for managed_node2
13531 1726882446.82545: Calling all_plugins_play to load vars for managed_node2
13531 1726882446.82548: Calling groups_plugins_inventory to load vars for managed_node2
13531 1726882446.82552: Calling groups_plugins_play to load vars for managed_node2
13531 1726882446.83519: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000d2
13531 1726882446.83524: WORKER PROCESS EXITING
13531 1726882446.84446: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13531 1726882446.86317: done with get_vars()
13531 1726882446.86342: done getting variables
13531 1726882446.86411: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25
Friday 20 September 2024 21:34:06 -0400 (0:00:00.058) 0:00:34.759 ******
13531 1726882446.86445: entering _queue_task() for managed_node2/fail
13531 1726882446.86804: worker is 1 (out of 1 available)
13531 1726882446.86816: exiting _queue_task() for managed_node2/fail
13531 1726882446.86830: done queuing things up, now waiting for results queue to drain
13531 1726882446.86831: waiting for pending results...
13531 1726882446.87133: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
13531 1726882446.87278: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000000d3
13531 1726882446.87297: variable 'ansible_search_path' from source: unknown
13531 1726882446.87305: variable 'ansible_search_path' from source: unknown
13531 1726882446.87348: calling self._execute()
13531 1726882446.87448: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882446.87458: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882446.87477: variable 'omit' from source: magic vars
13531 1726882446.87862: variable 'ansible_distribution_major_version' from source: facts
13531 1726882446.87883: Evaluated conditional (ansible_distribution_major_version != '6'): True
13531 1726882446.88067: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13531 1726882446.90550: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13531 1726882446.90625: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13531 1726882446.90680: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13531 1726882446.90720: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13531 1726882446.90835: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13531 1726882446.90976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13531 1726882446.91017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13531 1726882446.91048: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13531 1726882446.91105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13531 1726882446.91126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13531 1726882446.91244: variable 'ansible_distribution_major_version' from source: facts
13531 1726882446.91268: Evaluated conditional (ansible_distribution_major_version | int > 9): False
13531 1726882446.91278: when evaluation is False, skipping this task
13531 1726882446.91289: _execute() done
13531 1726882446.91297: dumping result to json
13531 1726882446.91304: done dumping result, returning
13531 1726882446.91321: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-4fd9-519d-0000000000d3]
13531 1726882446.91331: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000d3
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int > 9",
    "skip_reason": "Conditional result was False"
}
13531 1726882446.91485: no more pending results, returning what we have
13531 1726882446.91490: results queue empty
13531 1726882446.91491: checking for any_errors_fatal
13531 1726882446.91498: done checking for any_errors_fatal
13531 1726882446.91499: checking for max_fail_percentage
13531 1726882446.91501: done checking for max_fail_percentage
13531 1726882446.91502: checking to see if all hosts have failed and the running result is not ok
13531 1726882446.91503: done checking to see if all hosts have failed
13531 1726882446.91504: getting the remaining hosts for this loop
13531 1726882446.91506: done getting the remaining hosts for this loop
13531 1726882446.91510: getting the next task for host managed_node2
13531 1726882446.91516: done getting next task for host managed_node2
13531 1726882446.91521: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
13531 1726882446.91524: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13531 1726882446.91547: getting variables
13531 1726882446.91549: in VariableManager get_vars()
13531 1726882446.91615: Calling all_inventory to load vars for managed_node2
13531 1726882446.91619: Calling groups_inventory to load vars for managed_node2
13531 1726882446.91621: Calling all_plugins_inventory to load vars for managed_node2
13531 1726882446.91633: Calling all_plugins_play to load vars for managed_node2
13531 1726882446.91636: Calling groups_plugins_inventory to load vars for managed_node2
13531 1726882446.91639: Calling groups_plugins_play to load vars for managed_node2
13531 1726882446.92606: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000d3
13531 1726882446.92610: WORKER PROCESS EXITING
13531 1726882446.93189: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13531 1726882446.94811: done with get_vars()
13531 1726882446.94841: done getting variables
13531 1726882446.94901: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Friday 20 September 2024 21:34:06 -0400 (0:00:00.084) 0:00:34.844 ******
13531 1726882446.94932: entering _queue_task() for managed_node2/dnf
13531 1726882446.95249: worker is 1 (out of 1 available)
13531 1726882446.95261: exiting _queue_task() for managed_node2/dnf
13531 1726882446.95275: done queuing things up, now waiting for results queue to drain
13531 1726882446.95276: waiting for pending results...
13531 1726882446.95552: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
13531 1726882446.95646: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000000d4
13531 1726882446.95659: variable 'ansible_search_path' from source: unknown
13531 1726882446.95663: variable 'ansible_search_path' from source: unknown
13531 1726882446.95695: calling self._execute()
13531 1726882446.95774: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882446.95778: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882446.95786: variable 'omit' from source: magic vars
13531 1726882446.96070: variable 'ansible_distribution_major_version' from source: facts
13531 1726882446.96079: Evaluated conditional (ansible_distribution_major_version != '6'): True
13531 1726882446.96218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13531 1726882446.99116: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13531 1726882446.99182: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13531 1726882446.99215: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13531 1726882446.99249: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13531 1726882446.99280: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13531 1726882446.99361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13531 1726882446.99393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13531 1726882446.99417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13531 1726882446.99459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13531 1726882446.99471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13531 1726882446.99596: variable 'ansible_distribution' from source: facts
13531 1726882446.99600: variable 'ansible_distribution_major_version' from source: facts
13531 1726882446.99616: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True
13531 1726882446.99750: variable '__network_wireless_connections_defined' from source: role '' defaults
13531 1726882446.99882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13531 1726882446.99913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13531 1726882446.99943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13531 1726882446.99983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13531 1726882446.99996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13531 1726882447.00036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13531 1726882447.00059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13531 1726882447.00083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13531 1726882447.00121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13531 1726882447.00133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13531 1726882447.00172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13531 1726882447.00193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531
1726882447.00225: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882447.00269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882447.00279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882447.00433: variable 'network_connections' from source: task vars 13531 1726882447.00443: variable 'controller_profile' from source: play vars 13531 1726882447.00513: variable 'controller_profile' from source: play vars 13531 1726882447.00523: variable 'controller_device' from source: play vars 13531 1726882447.00583: variable 'controller_device' from source: play vars 13531 1726882447.00592: variable 'port1_profile' from source: play vars 13531 1726882447.00651: variable 'port1_profile' from source: play vars 13531 1726882447.00659: variable 'dhcp_interface1' from source: play vars 13531 1726882447.00719: variable 'dhcp_interface1' from source: play vars 13531 1726882447.00725: variable 'controller_profile' from source: play vars 13531 1726882447.00783: variable 'controller_profile' from source: play vars 13531 1726882447.00790: variable 'port2_profile' from source: play vars 13531 1726882447.00847: variable 'port2_profile' from source: play vars 13531 1726882447.00856: variable 'dhcp_interface2' from source: play vars 13531 1726882447.00913: variable 'dhcp_interface2' from source: play vars 13531 1726882447.00919: variable 'controller_profile' from source: play vars 13531 1726882447.00976: variable 'controller_profile' from source: play vars 13531 1726882447.01570: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13531 1726882447.01574: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13531 1726882447.01576: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13531 1726882447.01601: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13531 1726882447.01626: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13531 1726882447.01668: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13531 1726882447.01690: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13531 1726882447.01711: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882447.01733: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13531 1726882447.01791: variable '__network_team_connections_defined' from source: role '' defaults 13531 1726882447.02006: variable 'network_connections' from source: task vars 13531 1726882447.02011: variable 'controller_profile' from source: play vars 13531 1726882447.02071: variable 'controller_profile' from source: play vars 13531 1726882447.02078: variable 'controller_device' from source: play vars 13531 1726882447.02134: variable 'controller_device' from source: play vars 13531 1726882447.02142: variable 
'port1_profile' from source: play vars 13531 1726882447.02196: variable 'port1_profile' from source: play vars 13531 1726882447.02202: variable 'dhcp_interface1' from source: play vars 13531 1726882447.02259: variable 'dhcp_interface1' from source: play vars 13531 1726882447.02265: variable 'controller_profile' from source: play vars 13531 1726882447.02318: variable 'controller_profile' from source: play vars 13531 1726882447.02326: variable 'port2_profile' from source: play vars 13531 1726882447.02383: variable 'port2_profile' from source: play vars 13531 1726882447.02389: variable 'dhcp_interface2' from source: play vars 13531 1726882447.02440: variable 'dhcp_interface2' from source: play vars 13531 1726882447.02447: variable 'controller_profile' from source: play vars 13531 1726882447.02506: variable 'controller_profile' from source: play vars 13531 1726882447.02535: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13531 1726882447.02538: when evaluation is False, skipping this task 13531 1726882447.02541: _execute() done 13531 1726882447.02544: dumping result to json 13531 1726882447.02547: done dumping result, returning 13531 1726882447.02557: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-4fd9-519d-0000000000d4] 13531 1726882447.02560: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000d4 13531 1726882447.02659: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000d4 13531 1726882447.02662: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13531 1726882447.02710: no more pending results, returning what we have 13531 1726882447.02713: 
results queue empty 13531 1726882447.02714: checking for any_errors_fatal 13531 1726882447.02721: done checking for any_errors_fatal 13531 1726882447.02722: checking for max_fail_percentage 13531 1726882447.02724: done checking for max_fail_percentage 13531 1726882447.02724: checking to see if all hosts have failed and the running result is not ok 13531 1726882447.02725: done checking to see if all hosts have failed 13531 1726882447.02726: getting the remaining hosts for this loop 13531 1726882447.02727: done getting the remaining hosts for this loop 13531 1726882447.02731: getting the next task for host managed_node2 13531 1726882447.02737: done getting next task for host managed_node2 13531 1726882447.02740: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13531 1726882447.02743: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882447.02767: getting variables 13531 1726882447.02769: in VariableManager get_vars() 13531 1726882447.02823: Calling all_inventory to load vars for managed_node2 13531 1726882447.02825: Calling groups_inventory to load vars for managed_node2 13531 1726882447.02827: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882447.02838: Calling all_plugins_play to load vars for managed_node2 13531 1726882447.02840: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882447.02842: Calling groups_plugins_play to load vars for managed_node2 13531 1726882447.05539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882447.07447: done with get_vars() 13531 1726882447.07486: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 13531 1726882447.07569: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:34:07 -0400 (0:00:00.126) 0:00:34.971 ****** 13531 1726882447.07601: entering _queue_task() for managed_node2/yum 13531 1726882447.07943: worker is 1 (out of 1 available) 13531 1726882447.07959: exiting _queue_task() for managed_node2/yum 13531 1726882447.07974: done queuing things up, now waiting for results queue to drain 13531 1726882447.07975: waiting for pending results... 
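[editor's note] The DNF-check task above (main.yml:36) was skipped because its `when` condition, `__network_wireless_connections_defined or __network_team_connections_defined`, evaluated to False — the skip result's `false_condition` field reports exactly that string. The log also shows `redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf` before the next task: on this Fedora host the yum action is transparently served by dnf. A minimal sketch of the conditional-skip pattern being exercised here (illustrative reconstruction only — the condition is taken verbatim from the log, but the task body is an assumption, not the role's actual source):

```yaml
# Illustrative reconstruction; variable names come from the log,
# the module arguments are assumed.
- name: Check if updates for network packages are available through
    the DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: "{{ network_packages }}"
    state: latest
  check_mode: true
  when: __network_wireless_connections_defined or __network_team_connections_defined
```

When the `when` expression is False, Ansible short-circuits before `_execute()` runs the module, which is why the log shows "when evaluation is False, skipping this task" followed by the `skip_reason: Conditional result was False` result.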
13531 1726882447.08280: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13531 1726882447.08429: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000000d5 13531 1726882447.08448: variable 'ansible_search_path' from source: unknown 13531 1726882447.08458: variable 'ansible_search_path' from source: unknown 13531 1726882447.08503: calling self._execute() 13531 1726882447.08609: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882447.08619: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882447.08637: variable 'omit' from source: magic vars 13531 1726882447.09024: variable 'ansible_distribution_major_version' from source: facts 13531 1726882447.09041: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882447.09233: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13531 1726882447.11724: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13531 1726882447.11803: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13531 1726882447.11844: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13531 1726882447.11887: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13531 1726882447.11922: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13531 1726882447.12009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882447.12041: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882447.12079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882447.12128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882447.12146: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882447.12261: variable 'ansible_distribution_major_version' from source: facts 13531 1726882447.12285: Evaluated conditional (ansible_distribution_major_version | int < 8): False 13531 1726882447.12292: when evaluation is False, skipping this task 13531 1726882447.12299: _execute() done 13531 1726882447.12305: dumping result to json 13531 1726882447.12311: done dumping result, returning 13531 1726882447.12322: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-4fd9-519d-0000000000d5] 13531 1726882447.12337: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000d5 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 13531 1726882447.12506: no more pending results, returning what we have 13531 1726882447.12510: results queue empty 13531 1726882447.12511: checking for any_errors_fatal 13531 1726882447.12519: done 
checking for any_errors_fatal 13531 1726882447.12520: checking for max_fail_percentage 13531 1726882447.12522: done checking for max_fail_percentage 13531 1726882447.12523: checking to see if all hosts have failed and the running result is not ok 13531 1726882447.12524: done checking to see if all hosts have failed 13531 1726882447.12524: getting the remaining hosts for this loop 13531 1726882447.12526: done getting the remaining hosts for this loop 13531 1726882447.12530: getting the next task for host managed_node2 13531 1726882447.12536: done getting next task for host managed_node2 13531 1726882447.12540: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13531 1726882447.12544: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882447.12572: getting variables 13531 1726882447.12574: in VariableManager get_vars() 13531 1726882447.12633: Calling all_inventory to load vars for managed_node2 13531 1726882447.12635: Calling groups_inventory to load vars for managed_node2 13531 1726882447.12638: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882447.12650: Calling all_plugins_play to load vars for managed_node2 13531 1726882447.12656: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882447.12660: Calling groups_plugins_play to load vars for managed_node2 13531 1726882447.14092: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000d5 13531 1726882447.14097: WORKER PROCESS EXITING 13531 1726882447.14548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882447.16335: done with get_vars() 13531 1726882447.16375: done getting variables 13531 1726882447.16437: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:34:07 -0400 (0:00:00.088) 0:00:35.059 ****** 13531 1726882447.16475: entering _queue_task() for managed_node2/fail 13531 1726882447.16818: worker is 1 (out of 1 available) 13531 1726882447.16832: exiting _queue_task() for managed_node2/fail 13531 1726882447.16845: done queuing things up, now waiting for results queue to drain 13531 1726882447.16847: waiting for pending results... 
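[editor's note] The YUM variant of the same check (main.yml:48) carries an additional distribution gate, `ansible_distribution_major_version | int < 8`, so it only ever runs on EL7-era hosts where yum is the native package manager; here it evaluates False and the task is skipped before the wireless/team condition is even consulted. A hedged sketch of that gate (the `false_condition` string is verbatim from the skip result; the body is assumed):

```yaml
# Gate copied verbatim from the skip result's false_condition;
# the yum task body itself is an assumption.
- name: Check if updates for network packages are available through
    the YUM package manager due to wireless or team interfaces
  ansible.builtin.yum:
    name: "{{ network_packages }}"
    state: latest
  check_mode: true
  when: ansible_distribution_major_version | int < 8
```

Pairing a dnf task and a yum task with complementary version gates is a common pattern in roles that must span both EL7 and EL8+/Fedora targets.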
13531 1726882447.17143: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13531 1726882447.17299: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000000d6 13531 1726882447.17319: variable 'ansible_search_path' from source: unknown 13531 1726882447.17327: variable 'ansible_search_path' from source: unknown 13531 1726882447.17373: calling self._execute() 13531 1726882447.17477: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882447.17488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882447.17503: variable 'omit' from source: magic vars 13531 1726882447.17887: variable 'ansible_distribution_major_version' from source: facts 13531 1726882447.17906: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882447.18027: variable '__network_wireless_connections_defined' from source: role '' defaults 13531 1726882447.18214: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13531 1726882447.27049: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13531 1726882447.27105: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13531 1726882447.27130: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13531 1726882447.27158: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13531 1726882447.27178: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13531 1726882447.27228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 13531 1726882447.27250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882447.27270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882447.27296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882447.27307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882447.27337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882447.27356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882447.27375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882447.27399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882447.27410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882447.27436: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882447.27452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882447.27472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882447.27498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882447.27545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882447.27667: variable 'network_connections' from source: task vars 13531 1726882447.27677: variable 'controller_profile' from source: play vars 13531 1726882447.27741: variable 'controller_profile' from source: play vars 13531 1726882447.27749: variable 'controller_device' from source: play vars 13531 1726882447.27986: variable 'controller_device' from source: play vars 13531 1726882447.27989: variable 'port1_profile' from source: play vars 13531 1726882447.27992: variable 'port1_profile' from source: play vars 13531 1726882447.27994: variable 'dhcp_interface1' from source: play vars 13531 1726882447.27996: variable 'dhcp_interface1' from source: play vars 13531 1726882447.27998: variable 'controller_profile' from source: play vars 13531 
1726882447.28000: variable 'controller_profile' from source: play vars 13531 1726882447.28002: variable 'port2_profile' from source: play vars 13531 1726882447.28095: variable 'port2_profile' from source: play vars 13531 1726882447.28098: variable 'dhcp_interface2' from source: play vars 13531 1726882447.28204: variable 'dhcp_interface2' from source: play vars 13531 1726882447.28207: variable 'controller_profile' from source: play vars 13531 1726882447.28209: variable 'controller_profile' from source: play vars 13531 1726882447.28244: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13531 1726882447.28422: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13531 1726882447.28464: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13531 1726882447.28494: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13531 1726882447.28522: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13531 1726882447.28565: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13531 1726882447.28587: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13531 1726882447.28611: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882447.28639: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, 
class_only=False) 13531 1726882447.28688: variable '__network_team_connections_defined' from source: role '' defaults 13531 1726882447.28918: variable 'network_connections' from source: task vars 13531 1726882447.28921: variable 'controller_profile' from source: play vars 13531 1726882447.28986: variable 'controller_profile' from source: play vars 13531 1726882447.28992: variable 'controller_device' from source: play vars 13531 1726882447.29052: variable 'controller_device' from source: play vars 13531 1726882447.29060: variable 'port1_profile' from source: play vars 13531 1726882447.29118: variable 'port1_profile' from source: play vars 13531 1726882447.29124: variable 'dhcp_interface1' from source: play vars 13531 1726882447.29182: variable 'dhcp_interface1' from source: play vars 13531 1726882447.29186: variable 'controller_profile' from source: play vars 13531 1726882447.29242: variable 'controller_profile' from source: play vars 13531 1726882447.29249: variable 'port2_profile' from source: play vars 13531 1726882447.29306: variable 'port2_profile' from source: play vars 13531 1726882447.29313: variable 'dhcp_interface2' from source: play vars 13531 1726882447.29371: variable 'dhcp_interface2' from source: play vars 13531 1726882447.29377: variable 'controller_profile' from source: play vars 13531 1726882447.29434: variable 'controller_profile' from source: play vars 13531 1726882447.29470: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13531 1726882447.29474: when evaluation is False, skipping this task 13531 1726882447.29476: _execute() done 13531 1726882447.29479: dumping result to json 13531 1726882447.29481: done dumping result, returning 13531 1726882447.29489: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-4fd9-519d-0000000000d6] 13531 1726882447.29493: sending 
task result for task 0e448fcc-3ce9-4fd9-519d-0000000000d6 13531 1726882447.29587: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000d6 13531 1726882447.29590: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13531 1726882447.29650: no more pending results, returning what we have 13531 1726882447.29653: results queue empty 13531 1726882447.29656: checking for any_errors_fatal 13531 1726882447.29664: done checking for any_errors_fatal 13531 1726882447.29665: checking for max_fail_percentage 13531 1726882447.29667: done checking for max_fail_percentage 13531 1726882447.29667: checking to see if all hosts have failed and the running result is not ok 13531 1726882447.29668: done checking to see if all hosts have failed 13531 1726882447.29669: getting the remaining hosts for this loop 13531 1726882447.29670: done getting the remaining hosts for this loop 13531 1726882447.29673: getting the next task for host managed_node2 13531 1726882447.29678: done getting next task for host managed_node2 13531 1726882447.29682: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 13531 1726882447.29685: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882447.29703: getting variables 13531 1726882447.29704: in VariableManager get_vars() 13531 1726882447.29756: Calling all_inventory to load vars for managed_node2 13531 1726882447.29759: Calling groups_inventory to load vars for managed_node2 13531 1726882447.29761: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882447.29772: Calling all_plugins_play to load vars for managed_node2 13531 1726882447.29775: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882447.29777: Calling groups_plugins_play to load vars for managed_node2 13531 1726882447.35959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882447.36887: done with get_vars() 13531 1726882447.36911: done getting variables 13531 1726882447.36949: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:34:07 -0400 (0:00:00.204) 0:00:35.264 ****** 13531 1726882447.36973: entering _queue_task() for managed_node2/package 13531 1726882447.37217: worker is 1 (out of 1 available) 13531 1726882447.37231: exiting _queue_task() for managed_node2/package 13531 1726882447.37244: done queuing things up, now waiting for results queue to drain 13531 1726882447.37247: waiting for pending results... 
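The skip recorded just above ("Conditional result was False" for the consent task) comes from a role task gated on the wireless/team flags. A minimal, hypothetical sketch of such a conditionally skipped task — the variable names are taken from the log, but the task body is illustrative, not the role's actual source:

```yaml
# Illustrative sketch only -- not the actual fedora.linux_system_roles.network source.
- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.pause:
    prompt: "NetworkManager must be restarted. Continue? (yes/no)"
  # When neither flag is true, the task is skipped and the log reports
  # "Conditional result was False", as seen above.
  when: __network_wireless_connections_defined or __network_team_connections_defined
```

Because both `__network_wireless_connections_defined` and `__network_team_connections_defined` resolve to False for this connection set (bond controller plus two ethernet ports, no wireless or team profiles), the conditional short-circuits to False and the worker returns a skip result.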
13531 1726882447.37452: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 13531 1726882447.37652: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000000d7 13531 1726882447.37660: variable 'ansible_search_path' from source: unknown 13531 1726882447.37665: variable 'ansible_search_path' from source: unknown 13531 1726882447.37789: calling self._execute() 13531 1726882447.37999: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882447.38004: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882447.38015: variable 'omit' from source: magic vars 13531 1726882447.38512: variable 'ansible_distribution_major_version' from source: facts 13531 1726882447.38531: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882447.38728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13531 1726882447.39031: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13531 1726882447.39076: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13531 1726882447.39142: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13531 1726882447.39206: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13531 1726882447.39532: variable 'network_packages' from source: role '' defaults 13531 1726882447.39704: variable '__network_provider_setup' from source: role '' defaults 13531 1726882447.39728: variable '__network_service_name_default_nm' from source: role '' defaults 13531 1726882447.39814: variable '__network_service_name_default_nm' from source: role '' defaults 13531 1726882447.39835: variable '__network_packages_default_nm' from source: role '' defaults 13531 1726882447.39909: variable 
'__network_packages_default_nm' from source: role '' defaults 13531 1726882447.40113: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13531 1726882447.42538: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13531 1726882447.42629: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13531 1726882447.42683: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13531 1726882447.42728: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13531 1726882447.42781: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13531 1726882447.42874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882447.42912: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882447.42944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882447.42996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882447.43018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 
1726882447.43076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882447.43105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882447.43140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882447.43189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882447.43208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882447.43473: variable '__network_packages_default_gobject_packages' from source: role '' defaults 13531 1726882447.43598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882447.43627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882447.43658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882447.43708: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882447.43728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882447.43831: variable 'ansible_python' from source: facts 13531 1726882447.43870: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 13531 1726882447.43967: variable '__network_wpa_supplicant_required' from source: role '' defaults 13531 1726882447.44058: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13531 1726882447.44196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882447.44231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882447.44267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882447.44316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882447.44366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882447.44417: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882447.44465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882447.44496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882447.44540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882447.44570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882447.44725: variable 'network_connections' from source: task vars 13531 1726882447.44738: variable 'controller_profile' from source: play vars 13531 1726882447.44851: variable 'controller_profile' from source: play vars 13531 1726882447.44873: variable 'controller_device' from source: play vars 13531 1726882447.44982: variable 'controller_device' from source: play vars 13531 1726882447.45005: variable 'port1_profile' from source: play vars 13531 1726882447.45114: variable 'port1_profile' from source: play vars 13531 1726882447.45128: variable 'dhcp_interface1' from source: play vars 13531 1726882447.45236: variable 'dhcp_interface1' from source: play vars 13531 1726882447.45251: variable 'controller_profile' from source: play vars 13531 1726882447.45362: variable 'controller_profile' from source: play vars 13531 1726882447.45380: variable 'port2_profile' from source: play vars 13531 
1726882447.45488: variable 'port2_profile' from source: play vars 13531 1726882447.45503: variable 'dhcp_interface2' from source: play vars 13531 1726882447.45610: variable 'dhcp_interface2' from source: play vars 13531 1726882447.45624: variable 'controller_profile' from source: play vars 13531 1726882447.45730: variable 'controller_profile' from source: play vars 13531 1726882447.45819: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13531 1726882447.45852: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13531 1726882447.45896: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882447.45932: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13531 1726882447.45996: variable '__network_wireless_connections_defined' from source: role '' defaults 13531 1726882447.46314: variable 'network_connections' from source: task vars 13531 1726882447.46323: variable 'controller_profile' from source: play vars 13531 1726882447.46430: variable 'controller_profile' from source: play vars 13531 1726882447.46443: variable 'controller_device' from source: play vars 13531 1726882447.46550: variable 'controller_device' from source: play vars 13531 1726882447.46572: variable 'port1_profile' from source: play vars 13531 1726882447.46680: variable 'port1_profile' from source: play vars 13531 1726882447.46695: variable 'dhcp_interface1' from source: play vars 13531 1726882447.46802: variable 'dhcp_interface1' from source: 
play vars 13531 1726882447.46816: variable 'controller_profile' from source: play vars 13531 1726882447.46923: variable 'controller_profile' from source: play vars 13531 1726882447.46937: variable 'port2_profile' from source: play vars 13531 1726882447.47043: variable 'port2_profile' from source: play vars 13531 1726882447.47060: variable 'dhcp_interface2' from source: play vars 13531 1726882447.47167: variable 'dhcp_interface2' from source: play vars 13531 1726882447.47186: variable 'controller_profile' from source: play vars 13531 1726882447.47294: variable 'controller_profile' from source: play vars 13531 1726882447.47353: variable '__network_packages_default_wireless' from source: role '' defaults 13531 1726882447.47443: variable '__network_wireless_connections_defined' from source: role '' defaults 13531 1726882447.47791: variable 'network_connections' from source: task vars 13531 1726882447.47801: variable 'controller_profile' from source: play vars 13531 1726882447.47877: variable 'controller_profile' from source: play vars 13531 1726882447.47889: variable 'controller_device' from source: play vars 13531 1726882447.47961: variable 'controller_device' from source: play vars 13531 1726882447.47978: variable 'port1_profile' from source: play vars 13531 1726882447.48042: variable 'port1_profile' from source: play vars 13531 1726882447.48065: variable 'dhcp_interface1' from source: play vars 13531 1726882447.48130: variable 'dhcp_interface1' from source: play vars 13531 1726882447.48141: variable 'controller_profile' from source: play vars 13531 1726882447.48215: variable 'controller_profile' from source: play vars 13531 1726882447.48227: variable 'port2_profile' from source: play vars 13531 1726882447.48299: variable 'port2_profile' from source: play vars 13531 1726882447.48310: variable 'dhcp_interface2' from source: play vars 13531 1726882447.48388: variable 'dhcp_interface2' from source: play vars 13531 1726882447.48400: variable 'controller_profile' from 
source: play vars 13531 1726882447.48469: variable 'controller_profile' from source: play vars 13531 1726882447.48506: variable '__network_packages_default_team' from source: role '' defaults 13531 1726882447.48593: variable '__network_team_connections_defined' from source: role '' defaults 13531 1726882447.49187: variable 'network_connections' from source: task vars 13531 1726882447.49199: variable 'controller_profile' from source: play vars 13531 1726882447.49291: variable 'controller_profile' from source: play vars 13531 1726882447.49304: variable 'controller_device' from source: play vars 13531 1726882447.49378: variable 'controller_device' from source: play vars 13531 1726882447.49392: variable 'port1_profile' from source: play vars 13531 1726882447.49460: variable 'port1_profile' from source: play vars 13531 1726882447.49478: variable 'dhcp_interface1' from source: play vars 13531 1726882447.49543: variable 'dhcp_interface1' from source: play vars 13531 1726882447.49558: variable 'controller_profile' from source: play vars 13531 1726882447.49643: variable 'controller_profile' from source: play vars 13531 1726882447.49658: variable 'port2_profile' from source: play vars 13531 1726882447.49732: variable 'port2_profile' from source: play vars 13531 1726882447.49745: variable 'dhcp_interface2' from source: play vars 13531 1726882447.49822: variable 'dhcp_interface2' from source: play vars 13531 1726882447.49833: variable 'controller_profile' from source: play vars 13531 1726882447.49903: variable 'controller_profile' from source: play vars 13531 1726882447.50021: variable '__network_service_name_default_initscripts' from source: role '' defaults 13531 1726882447.50090: variable '__network_service_name_default_initscripts' from source: role '' defaults 13531 1726882447.50102: variable '__network_packages_default_initscripts' from source: role '' defaults 13531 1726882447.50173: variable '__network_packages_default_initscripts' from source: role '' defaults 13531 
1726882447.50427: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 13531 1726882447.52536: variable 'network_connections' from source: task vars 13531 1726882447.52549: variable 'controller_profile' from source: play vars 13531 1726882447.52643: variable 'controller_profile' from source: play vars 13531 1726882447.52814: variable 'controller_device' from source: play vars 13531 1726882447.52879: variable 'controller_device' from source: play vars 13531 1726882447.52897: variable 'port1_profile' from source: play vars 13531 1726882447.53086: variable 'port1_profile' from source: play vars 13531 1726882447.53099: variable 'dhcp_interface1' from source: play vars 13531 1726882447.53284: variable 'dhcp_interface1' from source: play vars 13531 1726882447.53295: variable 'controller_profile' from source: play vars 13531 1726882447.53403: variable 'controller_profile' from source: play vars 13531 1726882447.53481: variable 'port2_profile' from source: play vars 13531 1726882447.53542: variable 'port2_profile' from source: play vars 13531 1726882447.53629: variable 'dhcp_interface2' from source: play vars 13531 1726882447.53757: variable 'dhcp_interface2' from source: play vars 13531 1726882447.53822: variable 'controller_profile' from source: play vars 13531 1726882447.54033: variable 'controller_profile' from source: play vars 13531 1726882447.54047: variable 'ansible_distribution' from source: facts 13531 1726882447.54056: variable '__network_rh_distros' from source: role '' defaults 13531 1726882447.54083: variable 'ansible_distribution_major_version' from source: facts 13531 1726882447.54120: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 13531 1726882447.54877: variable 'ansible_distribution' from source: facts 13531 1726882447.54946: variable '__network_rh_distros' from source: role '' defaults 13531 1726882447.55081: variable 'ansible_distribution_major_version' from source: 
facts 13531 1726882447.55147: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 13531 1726882447.55835: variable 'ansible_distribution' from source: facts 13531 1726882447.55852: variable '__network_rh_distros' from source: role '' defaults 13531 1726882447.55880: variable 'ansible_distribution_major_version' from source: facts 13531 1726882447.55966: variable 'network_provider' from source: set_fact 13531 1726882447.56026: variable 'ansible_facts' from source: unknown 13531 1726882447.57404: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 13531 1726882447.57414: when evaluation is False, skipping this task 13531 1726882447.57422: _execute() done 13531 1726882447.57428: dumping result to json 13531 1726882447.57434: done dumping result, returning 13531 1726882447.57446: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-4fd9-519d-0000000000d7] 13531 1726882447.57474: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000d7 skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 13531 1726882447.57635: no more pending results, returning what we have 13531 1726882447.57638: results queue empty 13531 1726882447.57639: checking for any_errors_fatal 13531 1726882447.57649: done checking for any_errors_fatal 13531 1726882447.57650: checking for max_fail_percentage 13531 1726882447.57652: done checking for max_fail_percentage 13531 1726882447.57653: checking to see if all hosts have failed and the running result is not ok 13531 1726882447.57656: done checking to see if all hosts have failed 13531 1726882447.57657: getting the remaining hosts for this loop 13531 1726882447.57658: done getting the remaining hosts for this loop 13531 1726882447.57662: getting the next task for host 
managed_node2 13531 1726882447.57669: done getting next task for host managed_node2 13531 1726882447.57673: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 13531 1726882447.57676: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882447.57699: getting variables 13531 1726882447.57701: in VariableManager get_vars() 13531 1726882447.57755: Calling all_inventory to load vars for managed_node2 13531 1726882447.57758: Calling groups_inventory to load vars for managed_node2 13531 1726882447.57760: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882447.57774: Calling all_plugins_play to load vars for managed_node2 13531 1726882447.57777: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882447.57780: Calling groups_plugins_play to load vars for managed_node2 13531 1726882447.58473: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000d7 13531 1726882447.58476: WORKER PROCESS EXITING 13531 1726882447.59619: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882447.62017: done with get_vars() 13531 1726882447.62048: done getting variables 13531 1726882447.62114: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:34:07 -0400 (0:00:00.251) 0:00:35.516 ****** 13531 1726882447.62147: entering _queue_task() for managed_node2/package 13531 1726882447.62479: worker is 1 (out of 1 available) 13531 1726882447.62492: exiting _queue_task() for managed_node2/package 13531 1726882447.62505: done queuing things up, now waiting for results queue to drain 13531 1726882447.62506: waiting for pending results... 13531 1726882447.63655: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 13531 1726882447.63890: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000000d8 13531 1726882447.64123: variable 'ansible_search_path' from source: unknown 13531 1726882447.64131: variable 'ansible_search_path' from source: unknown 13531 1726882447.64175: calling self._execute() 13531 1726882447.64275: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882447.64993: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882447.65008: variable 'omit' from source: magic vars 13531 1726882447.65385: variable 'ansible_distribution_major_version' from source: facts 13531 1726882447.65419: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882447.65542: variable 'network_state' from source: role '' defaults 13531 1726882447.65559: Evaluated conditional (network_state != {}): False 13531 1726882447.65571: when evaluation is False, skipping this task 13531 1726882447.65579: _execute() done 13531 
1726882447.65585: dumping result to json 13531 1726882447.65593: done dumping result, returning 13531 1726882447.65606: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-4fd9-519d-0000000000d8] 13531 1726882447.65618: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000d8 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13531 1726882447.65789: no more pending results, returning what we have 13531 1726882447.65794: results queue empty 13531 1726882447.65795: checking for any_errors_fatal 13531 1726882447.65803: done checking for any_errors_fatal 13531 1726882447.65804: checking for max_fail_percentage 13531 1726882447.65806: done checking for max_fail_percentage 13531 1726882447.65807: checking to see if all hosts have failed and the running result is not ok 13531 1726882447.65808: done checking to see if all hosts have failed 13531 1726882447.65809: getting the remaining hosts for this loop 13531 1726882447.65810: done getting the remaining hosts for this loop 13531 1726882447.65814: getting the next task for host managed_node2 13531 1726882447.65821: done getting next task for host managed_node2 13531 1726882447.65825: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 13531 1726882447.65829: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 13531 1726882447.65851: getting variables 13531 1726882447.65854: in VariableManager get_vars() 13531 1726882447.65918: Calling all_inventory to load vars for managed_node2 13531 1726882447.65922: Calling groups_inventory to load vars for managed_node2 13531 1726882447.65924: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882447.65937: Calling all_plugins_play to load vars for managed_node2 13531 1726882447.65940: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882447.65943: Calling groups_plugins_play to load vars for managed_node2 13531 1726882447.67173: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000d8 13531 1726882447.67177: WORKER PROCESS EXITING 13531 1726882447.68822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882447.72438: done with get_vars() 13531 1726882447.72475: done getting variables 13531 1726882447.72538: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:34:07 -0400 (0:00:00.104) 0:00:35.620 ****** 13531 1726882447.72574: entering _queue_task() for managed_node2/package 13531 1726882447.73220: worker is 1 (out of 1 available) 13531 1726882447.73231: exiting _queue_task() for managed_node2/package 13531 1726882447.73245: done queuing things up, now waiting for results queue to drain 13531 1726882447.73247: waiting for pending results... 
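The `Install packages` skip above hinges on Jinja2's `subset` test: the task only runs when some entry of `network_packages` is missing from the gathered package facts. A hedged sketch of that gate — the `when` expression matches the conditional shown in the log, while the task body is illustrative:

```yaml
# Illustrative sketch -- mirrors the conditional evaluated in the log,
# not the role's actual tasks/main.yml.
- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present
  # Skipped when every required package already appears in
  # ansible_facts.packages (populated earlier by package_facts).
  when: not network_packages is subset(ansible_facts.packages.keys())
```

Here the evaluation came back False (all packages already installed), so no package transaction was attempted on managed_node2.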
13531 1726882447.74275: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 13531 1726882447.74539: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000000d9 13531 1726882447.74560: variable 'ansible_search_path' from source: unknown 13531 1726882447.74640: variable 'ansible_search_path' from source: unknown 13531 1726882447.74687: calling self._execute() 13531 1726882447.74906: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882447.74917: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882447.74930: variable 'omit' from source: magic vars 13531 1726882447.75675: variable 'ansible_distribution_major_version' from source: facts 13531 1726882447.75740: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882447.76076: variable 'network_state' from source: role '' defaults 13531 1726882447.76093: Evaluated conditional (network_state != {}): False 13531 1726882447.76102: when evaluation is False, skipping this task 13531 1726882447.76110: _execute() done 13531 1726882447.76116: dumping result to json 13531 1726882447.76123: done dumping result, returning 13531 1726882447.76135: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-4fd9-519d-0000000000d9] 13531 1726882447.76146: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000d9 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13531 1726882447.76320: no more pending results, returning what we have 13531 1726882447.76324: results queue empty 13531 1726882447.76325: checking for any_errors_fatal 13531 1726882447.76333: done checking for any_errors_fatal 13531 1726882447.76334: checking for max_fail_percentage 13531 
1726882447.76336: done checking for max_fail_percentage 13531 1726882447.76337: checking to see if all hosts have failed and the running result is not ok 13531 1726882447.76337: done checking to see if all hosts have failed 13531 1726882447.76338: getting the remaining hosts for this loop 13531 1726882447.76340: done getting the remaining hosts for this loop 13531 1726882447.76343: getting the next task for host managed_node2 13531 1726882447.76350: done getting next task for host managed_node2 13531 1726882447.76354: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 13531 1726882447.76358: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882447.76382: getting variables 13531 1726882447.76385: in VariableManager get_vars() 13531 1726882447.76447: Calling all_inventory to load vars for managed_node2 13531 1726882447.76451: Calling groups_inventory to load vars for managed_node2 13531 1726882447.76453: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882447.76469: Calling all_plugins_play to load vars for managed_node2 13531 1726882447.76472: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882447.76476: Calling groups_plugins_play to load vars for managed_node2 13531 1726882447.78835: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000d9 13531 1726882447.78840: WORKER PROCESS EXITING 13531 1726882447.79835: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882447.84019: done with get_vars() 13531 1726882447.84053: done getting variables 13531 1726882447.84117: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:34:07 -0400 (0:00:00.115) 0:00:35.736 ****** 13531 1726882447.84152: entering _queue_task() for managed_node2/service 13531 1726882447.84492: worker is 1 (out of 1 available) 13531 1726882447.84505: exiting _queue_task() for managed_node2/service 13531 1726882447.84518: done queuing things up, now waiting for results queue to drain 13531 1726882447.84519: waiting for pending results... 
13531 1726882447.85459: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 13531 1726882447.85831: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000000da 13531 1726882447.85853: variable 'ansible_search_path' from source: unknown 13531 1726882447.85862: variable 'ansible_search_path' from source: unknown 13531 1726882447.85910: calling self._execute() 13531 1726882447.86139: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882447.86271: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882447.86286: variable 'omit' from source: magic vars 13531 1726882447.87002: variable 'ansible_distribution_major_version' from source: facts 13531 1726882447.87024: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882447.87260: variable '__network_wireless_connections_defined' from source: role '' defaults 13531 1726882447.87657: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13531 1726882447.92354: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13531 1726882447.92453: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13531 1726882447.92565: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13531 1726882447.92671: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13531 1726882447.92704: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13531 1726882447.92908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 13531 1726882447.92943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882447.93100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882447.93145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882447.93166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882447.93222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882447.93314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882447.93424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882447.93471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882447.93492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882447.93656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882447.93690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882447.93725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882447.93770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882447.93848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882447.94226: variable 'network_connections' from source: task vars 13531 1726882447.94285: variable 'controller_profile' from source: play vars 13531 1726882447.94359: variable 'controller_profile' from source: play vars 13531 1726882447.94502: variable 'controller_device' from source: play vars 13531 1726882447.94600: variable 'controller_device' from source: play vars 13531 1726882447.94617: variable 'port1_profile' from source: play vars 13531 1726882447.94766: variable 'port1_profile' from source: play vars 13531 1726882447.94779: variable 'dhcp_interface1' from source: play vars 13531 1726882447.94955: variable 'dhcp_interface1' from source: play vars 13531 1726882447.94971: variable 'controller_profile' from source: play vars 13531 
1726882447.95147: variable 'controller_profile' from source: play vars 13531 1726882447.95160: variable 'port2_profile' from source: play vars 13531 1726882447.95222: variable 'port2_profile' from source: play vars 13531 1726882447.95235: variable 'dhcp_interface2' from source: play vars 13531 1726882447.95393: variable 'dhcp_interface2' from source: play vars 13531 1726882447.95405: variable 'controller_profile' from source: play vars 13531 1726882447.95575: variable 'controller_profile' from source: play vars 13531 1726882447.95652: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13531 1726882447.96121: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13531 1726882447.96169: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13531 1726882447.96204: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13531 1726882447.96359: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13531 1726882447.96412: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13531 1726882447.96444: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13531 1726882447.96575: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882447.96605: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, 
class_only=False) 13531 1726882447.96693: variable '__network_team_connections_defined' from source: role '' defaults 13531 1726882447.97325: variable 'network_connections' from source: task vars 13531 1726882447.97335: variable 'controller_profile' from source: play vars 13531 1726882447.97401: variable 'controller_profile' from source: play vars 13531 1726882447.97477: variable 'controller_device' from source: play vars 13531 1726882447.97543: variable 'controller_device' from source: play vars 13531 1726882447.97639: variable 'port1_profile' from source: play vars 13531 1726882447.97701: variable 'port1_profile' from source: play vars 13531 1726882447.97747: variable 'dhcp_interface1' from source: play vars 13531 1726882447.97810: variable 'dhcp_interface1' from source: play vars 13531 1726882447.97878: variable 'controller_profile' from source: play vars 13531 1726882447.97938: variable 'controller_profile' from source: play vars 13531 1726882447.98074: variable 'port2_profile' from source: play vars 13531 1726882447.98135: variable 'port2_profile' from source: play vars 13531 1726882447.98178: variable 'dhcp_interface2' from source: play vars 13531 1726882447.98238: variable 'dhcp_interface2' from source: play vars 13531 1726882447.98397: variable 'controller_profile' from source: play vars 13531 1726882447.98458: variable 'controller_profile' from source: play vars 13531 1726882447.98507: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13531 1726882447.98610: when evaluation is False, skipping this task 13531 1726882447.98618: _execute() done 13531 1726882447.98625: dumping result to json 13531 1726882447.98631: done dumping result, returning 13531 1726882447.98643: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-4fd9-519d-0000000000da] 13531 1726882447.98653: sending task result for task 
0e448fcc-3ce9-4fd9-519d-0000000000da skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13531 1726882447.98801: no more pending results, returning what we have 13531 1726882447.98805: results queue empty 13531 1726882447.98806: checking for any_errors_fatal 13531 1726882447.98813: done checking for any_errors_fatal 13531 1726882447.98813: checking for max_fail_percentage 13531 1726882447.98815: done checking for max_fail_percentage 13531 1726882447.98816: checking to see if all hosts have failed and the running result is not ok 13531 1726882447.98817: done checking to see if all hosts have failed 13531 1726882447.98818: getting the remaining hosts for this loop 13531 1726882447.98819: done getting the remaining hosts for this loop 13531 1726882447.98823: getting the next task for host managed_node2 13531 1726882447.98830: done getting next task for host managed_node2 13531 1726882447.98834: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13531 1726882447.98837: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882447.98857: getting variables 13531 1726882447.98859: in VariableManager get_vars() 13531 1726882447.98920: Calling all_inventory to load vars for managed_node2 13531 1726882447.98923: Calling groups_inventory to load vars for managed_node2 13531 1726882447.98926: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882447.98937: Calling all_plugins_play to load vars for managed_node2 13531 1726882447.98940: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882447.98944: Calling groups_plugins_play to load vars for managed_node2 13531 1726882448.00219: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000da 13531 1726882448.00223: WORKER PROCESS EXITING 13531 1726882448.01607: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882448.04995: done with get_vars() 13531 1726882448.05021: done getting variables 13531 1726882448.05084: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:34:08 -0400 (0:00:00.209) 0:00:35.946 ****** 13531 1726882448.05123: entering _queue_task() for managed_node2/service 13531 1726882448.05699: worker is 1 (out of 1 available) 13531 1726882448.05712: exiting _queue_task() for managed_node2/service 13531 1726882448.05725: done queuing things up, now waiting for results queue to drain 13531 1726882448.05726: waiting for pending results... 
13531 1726882448.07162: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13531 1726882448.07809: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000000db 13531 1726882448.07825: variable 'ansible_search_path' from source: unknown 13531 1726882448.07828: variable 'ansible_search_path' from source: unknown 13531 1726882448.07883: calling self._execute() 13531 1726882448.07988: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882448.07992: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882448.08003: variable 'omit' from source: magic vars 13531 1726882448.08705: variable 'ansible_distribution_major_version' from source: facts 13531 1726882448.08709: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882448.08882: variable 'network_provider' from source: set_fact 13531 1726882448.08887: variable 'network_state' from source: role '' defaults 13531 1726882448.08898: Evaluated conditional (network_provider == "nm" or network_state != {}): True 13531 1726882448.08905: variable 'omit' from source: magic vars 13531 1726882448.08971: variable 'omit' from source: magic vars 13531 1726882448.09003: variable 'network_service_name' from source: role '' defaults 13531 1726882448.09076: variable 'network_service_name' from source: role '' defaults 13531 1726882448.10190: variable '__network_provider_setup' from source: role '' defaults 13531 1726882448.10194: variable '__network_service_name_default_nm' from source: role '' defaults 13531 1726882448.10264: variable '__network_service_name_default_nm' from source: role '' defaults 13531 1726882448.10273: variable '__network_packages_default_nm' from source: role '' defaults 13531 1726882448.10335: variable '__network_packages_default_nm' from source: role '' defaults 13531 1726882448.10575: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 13531 1726882448.15726: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13531 1726882448.15802: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13531 1726882448.15839: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13531 1726882448.16395: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13531 1726882448.16425: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13531 1726882448.16512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882448.16541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882448.16574: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882448.16614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882448.16629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882448.16981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 13531 1726882448.17004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882448.17029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882448.17073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882448.17087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882448.17423: variable '__network_packages_default_gobject_packages' from source: role '' defaults 13531 1726882448.17751: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882448.17783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882448.17804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882448.17844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882448.17860: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882448.18069: variable 'ansible_python' from source: facts 13531 1726882448.18294: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 13531 1726882448.18420: variable '__network_wpa_supplicant_required' from source: role '' defaults 13531 1726882448.18700: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13531 1726882448.18832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882448.18859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882448.19092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882448.19132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882448.19147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882448.19199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882448.19223: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882448.19246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882448.19494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882448.19509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882448.19657: variable 'network_connections' from source: task vars 13531 1726882448.19672: variable 'controller_profile' from source: play vars 13531 1726882448.19954: variable 'controller_profile' from source: play vars 13531 1726882448.19971: variable 'controller_device' from source: play vars 13531 1726882448.20042: variable 'controller_device' from source: play vars 13531 1726882448.20057: variable 'port1_profile' from source: play vars 13531 1726882448.20285: variable 'port1_profile' from source: play vars 13531 1726882448.20424: variable 'dhcp_interface1' from source: play vars 13531 1726882448.21196: variable 'dhcp_interface1' from source: play vars 13531 1726882448.21212: variable 'controller_profile' from source: play vars 13531 1726882448.21293: variable 'controller_profile' from source: play vars 13531 1726882448.21304: variable 'port2_profile' from source: play vars 13531 1726882448.22591: variable 'port2_profile' from source: play vars 13531 1726882448.22602: variable 'dhcp_interface2' from source: play vars 13531 1726882448.22698: variable 'dhcp_interface2' from source: play vars 13531 
1726882448.22707: variable 'controller_profile' from source: play vars 13531 1726882448.23448: variable 'controller_profile' from source: play vars 13531 1726882448.23694: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13531 1726882448.24088: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13531 1726882448.24136: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13531 1726882448.24182: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13531 1726882448.24220: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13531 1726882448.24288: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13531 1726882448.24316: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13531 1726882448.24349: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882448.24384: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13531 1726882448.24437: variable '__network_wireless_connections_defined' from source: role '' defaults 13531 1726882448.25466: variable 'network_connections' from source: task vars 13531 1726882448.25470: variable 'controller_profile' from source: play vars 13531 1726882448.25553: variable 'controller_profile' from source: play vars 13531 
1726882448.25573: variable 'controller_device' from source: play vars 13531 1726882448.25633: variable 'controller_device' from source: play vars 13531 1726882448.25646: variable 'port1_profile' from source: play vars 13531 1726882448.25719: variable 'port1_profile' from source: play vars 13531 1726882448.25730: variable 'dhcp_interface1' from source: play vars 13531 1726882448.25809: variable 'dhcp_interface1' from source: play vars 13531 1726882448.25812: variable 'controller_profile' from source: play vars 13531 1726882448.25885: variable 'controller_profile' from source: play vars 13531 1726882448.25896: variable 'port2_profile' from source: play vars 13531 1726882448.25967: variable 'port2_profile' from source: play vars 13531 1726882448.27290: variable 'dhcp_interface2' from source: play vars 13531 1726882448.27379: variable 'dhcp_interface2' from source: play vars 13531 1726882448.27389: variable 'controller_profile' from source: play vars 13531 1726882448.27466: variable 'controller_profile' from source: play vars 13531 1726882448.27520: variable '__network_packages_default_wireless' from source: role '' defaults 13531 1726882448.27603: variable '__network_wireless_connections_defined' from source: role '' defaults 13531 1726882448.27908: variable 'network_connections' from source: task vars 13531 1726882448.27913: variable 'controller_profile' from source: play vars 13531 1726882448.28262: variable 'controller_profile' from source: play vars 13531 1726882448.28267: variable 'controller_device' from source: play vars 13531 1726882448.28270: variable 'controller_device' from source: play vars 13531 1726882448.28272: variable 'port1_profile' from source: play vars 13531 1726882448.28403: variable 'port1_profile' from source: play vars 13531 1726882448.28406: variable 'dhcp_interface1' from source: play vars 13531 1726882448.28409: variable 'dhcp_interface1' from source: play vars 13531 1726882448.28411: variable 'controller_profile' from source: play vars 
13531 1726882448.28453: variable 'controller_profile' from source: play vars 13531 1726882448.28462: variable 'port2_profile' from source: play vars 13531 1726882448.28527: variable 'port2_profile' from source: play vars 13531 1726882448.28533: variable 'dhcp_interface2' from source: play vars 13531 1726882448.28605: variable 'dhcp_interface2' from source: play vars 13531 1726882448.28613: variable 'controller_profile' from source: play vars 13531 1726882448.28681: variable 'controller_profile' from source: play vars 13531 1726882448.28711: variable '__network_packages_default_team' from source: role '' defaults 13531 1726882448.29501: variable '__network_team_connections_defined' from source: role '' defaults 13531 1726882448.29831: variable 'network_connections' from source: task vars 13531 1726882448.29834: variable 'controller_profile' from source: play vars 13531 1726882448.29906: variable 'controller_profile' from source: play vars 13531 1726882448.29939: variable 'controller_device' from source: play vars 13531 1726882448.29982: variable 'controller_device' from source: play vars 13531 1726882448.29988: variable 'port1_profile' from source: play vars 13531 1726882448.30060: variable 'port1_profile' from source: play vars 13531 1726882448.30065: variable 'dhcp_interface1' from source: play vars 13531 1726882448.30130: variable 'dhcp_interface1' from source: play vars 13531 1726882448.30133: variable 'controller_profile' from source: play vars 13531 1726882448.30963: variable 'controller_profile' from source: play vars 13531 1726882448.30972: variable 'port2_profile' from source: play vars 13531 1726882448.31047: variable 'port2_profile' from source: play vars 13531 1726882448.31053: variable 'dhcp_interface2' from source: play vars 13531 1726882448.31125: variable 'dhcp_interface2' from source: play vars 13531 1726882448.31131: variable 'controller_profile' from source: play vars 13531 1726882448.31203: variable 'controller_profile' from source: play vars 
13531 1726882448.31276: variable '__network_service_name_default_initscripts' from source: role '' defaults 13531 1726882448.31335: variable '__network_service_name_default_initscripts' from source: role '' defaults 13531 1726882448.31338: variable '__network_packages_default_initscripts' from source: role '' defaults 13531 1726882448.31402: variable '__network_packages_default_initscripts' from source: role '' defaults 13531 1726882448.32440: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 13531 1726882448.32969: variable 'network_connections' from source: task vars 13531 1726882448.32973: variable 'controller_profile' from source: play vars 13531 1726882448.33032: variable 'controller_profile' from source: play vars 13531 1726882448.33038: variable 'controller_device' from source: play vars 13531 1726882448.33813: variable 'controller_device' from source: play vars 13531 1726882448.33822: variable 'port1_profile' from source: play vars 13531 1726882448.33893: variable 'port1_profile' from source: play vars 13531 1726882448.33900: variable 'dhcp_interface1' from source: play vars 13531 1726882448.33960: variable 'dhcp_interface1' from source: play vars 13531 1726882448.33968: variable 'controller_profile' from source: play vars 13531 1726882448.34026: variable 'controller_profile' from source: play vars 13531 1726882448.34033: variable 'port2_profile' from source: play vars 13531 1726882448.34093: variable 'port2_profile' from source: play vars 13531 1726882448.34100: variable 'dhcp_interface2' from source: play vars 13531 1726882448.34156: variable 'dhcp_interface2' from source: play vars 13531 1726882448.34167: variable 'controller_profile' from source: play vars 13531 1726882448.34223: variable 'controller_profile' from source: play vars 13531 1726882448.34231: variable 'ansible_distribution' from source: facts 13531 1726882448.34234: variable '__network_rh_distros' from source: role '' defaults 13531 1726882448.34241: 
variable 'ansible_distribution_major_version' from source: facts 13531 1726882448.34278: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 13531 1726882448.34467: variable 'ansible_distribution' from source: facts 13531 1726882448.35054: variable '__network_rh_distros' from source: role '' defaults 13531 1726882448.35069: variable 'ansible_distribution_major_version' from source: facts 13531 1726882448.35087: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 13531 1726882448.35514: variable 'ansible_distribution' from source: facts 13531 1726882448.35523: variable '__network_rh_distros' from source: role '' defaults 13531 1726882448.35531: variable 'ansible_distribution_major_version' from source: facts 13531 1726882448.35577: variable 'network_provider' from source: set_fact 13531 1726882448.35611: variable 'omit' from source: magic vars 13531 1726882448.35733: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882448.35767: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882448.35790: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882448.35834: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882448.35929: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882448.35966: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882448.35974: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882448.35982: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882448.36206: Set connection var ansible_pipelining to False 13531 
1726882448.36249: Set connection var ansible_timeout to 10 13531 1726882448.36259: Set connection var ansible_shell_executable to /bin/sh 13531 1726882448.36270: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882448.36276: Set connection var ansible_connection to ssh 13531 1726882448.36350: Set connection var ansible_shell_type to sh 13531 1726882448.36389: variable 'ansible_shell_executable' from source: unknown 13531 1726882448.36397: variable 'ansible_connection' from source: unknown 13531 1726882448.36403: variable 'ansible_module_compression' from source: unknown 13531 1726882448.36408: variable 'ansible_shell_type' from source: unknown 13531 1726882448.36414: variable 'ansible_shell_executable' from source: unknown 13531 1726882448.36420: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882448.36426: variable 'ansible_pipelining' from source: unknown 13531 1726882448.36432: variable 'ansible_timeout' from source: unknown 13531 1726882448.36438: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882448.36556: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882448.36686: variable 'omit' from source: magic vars 13531 1726882448.36792: starting attempt loop 13531 1726882448.36798: running the handler 13531 1726882448.37010: variable 'ansible_facts' from source: unknown 13531 1726882448.38674: _low_level_execute_command(): starting 13531 1726882448.38834: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13531 1726882448.40690: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882448.40695: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882448.40723: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882448.40727: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882448.40729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882448.41008: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882448.41011: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882448.41203: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882448.41306: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882448.42982: stdout chunk (state=3): >>>/root <<< 13531 1726882448.43077: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882448.43159: stderr chunk (state=3): >>><<< 13531 1726882448.43162: stdout chunk (state=3): >>><<< 13531 1726882448.43282: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882448.43291: _low_level_execute_command(): starting 13531 1726882448.43294: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882448.431844-15126-117513696611556 `" && echo ansible-tmp-1726882448.431844-15126-117513696611556="` echo /root/.ansible/tmp/ansible-tmp-1726882448.431844-15126-117513696611556 `" ) && sleep 0' 13531 1726882448.44758: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882448.44763: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882448.44800: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 13531 1726882448.44804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882448.44806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882448.44869: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882448.44882: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882448.45504: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882448.45601: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882448.47525: stdout chunk (state=3): >>>ansible-tmp-1726882448.431844-15126-117513696611556=/root/.ansible/tmp/ansible-tmp-1726882448.431844-15126-117513696611556 <<< 13531 1726882448.47624: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882448.47710: stderr chunk (state=3): >>><<< 13531 1726882448.47713: stdout chunk (state=3): >>><<< 13531 1726882448.47973: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882448.431844-15126-117513696611556=/root/.ansible/tmp/ansible-tmp-1726882448.431844-15126-117513696611556 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882448.47977: variable 'ansible_module_compression' from source: unknown 13531 1726882448.47980: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13531i_snf1g8/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 13531 1726882448.47982: variable 'ansible_facts' from source: unknown 13531 1726882448.48112: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882448.431844-15126-117513696611556/AnsiballZ_systemd.py 13531 1726882448.48723: Sending initial data 13531 1726882448.48728: Sent initial data (155 bytes) 13531 1726882448.50833: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882448.50850: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882448.50869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882448.50888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882448.50941: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882448.50955: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882448.50973: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882448.50992: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882448.51005: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882448.51017: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882448.51037: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882448.51051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882448.51071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882448.51147: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882448.51160: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882448.51177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882448.51371: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882448.51396: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882448.51413: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882448.51548: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882448.53345: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" 
revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13531 1726882448.53441: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13531 1726882448.53549: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13531i_snf1g8/tmpqcrq80ch /root/.ansible/tmp/ansible-tmp-1726882448.431844-15126-117513696611556/AnsiballZ_systemd.py <<< 13531 1726882448.53646: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13531 1726882448.57089: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882448.57165: stderr chunk (state=3): >>><<< 13531 1726882448.57168: stdout chunk (state=3): >>><<< 13531 1726882448.57194: done transferring module to remote 13531 1726882448.57202: _low_level_execute_command(): starting 13531 1726882448.57207: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882448.431844-15126-117513696611556/ /root/.ansible/tmp/ansible-tmp-1726882448.431844-15126-117513696611556/AnsiballZ_systemd.py && sleep 0' 13531 1726882448.58843: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882448.59083: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882448.59093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882448.59107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882448.59151: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882448.59155: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882448.59171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 
1726882448.59185: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882448.59193: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882448.59199: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882448.59208: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882448.59217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882448.59228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882448.59235: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882448.59242: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882448.59253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882448.59326: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882448.59342: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882448.59345: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882448.60093: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882448.61977: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882448.61982: stdout chunk (state=3): >>><<< 13531 1726882448.61985: stderr chunk (state=3): >>><<< 13531 1726882448.62003: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882448.62006: _low_level_execute_command(): starting 13531 1726882448.62011: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882448.431844-15126-117513696611556/AnsiballZ_systemd.py && sleep 0' 13531 1726882448.63534: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882448.63686: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882448.63696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882448.63711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882448.64417: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882448.64423: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882448.64434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882448.64448: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 
1726882448.64456: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882448.64468: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882448.64476: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882448.64486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882448.64499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882448.64505: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882448.64511: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882448.64520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882448.64599: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882448.64618: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882448.64644: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882448.64756: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882448.90089: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", 
"NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "8982528", "MemoryAvailable": "infinity", "CPUUsageNSec": "1002216000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", 
"CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": 
"no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service 
network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", 
"state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 13531 1726882448.91536: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 13531 1726882448.91541: stdout chunk (state=3): >>><<< 13531 1726882448.91545: stderr chunk (state=3): >>><<< 13531 1726882448.91576: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl 
call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "8982528", "MemoryAvailable": "infinity", "CPUUsageNSec": "1002216000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": 
"infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": 
"2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", 
"ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 13531 1726882448.91758: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882448.431844-15126-117513696611556/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13531 1726882448.91787: _low_level_execute_command(): starting 13531 1726882448.91790: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882448.431844-15126-117513696611556/ > /dev/null 2>&1 && sleep 0' 13531 1726882448.93142: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882448.93146: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882448.93809: stderr chunk (state=3): >>>debug2: checking match for 
'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882448.93813: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882448.93830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 13531 1726882448.93837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882448.93911: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882448.93924: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882448.93934: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882448.94066: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882448.95980: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882448.95985: stderr chunk (state=3): >>><<< 13531 1726882448.95987: stdout chunk (state=3): >>><<< 13531 1726882448.96004: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882448.96011: handler run complete 13531 1726882448.96081: attempt loop complete, returning result 13531 1726882448.96084: _execute() done 13531 1726882448.96087: dumping result to json 13531 1726882448.96103: done dumping result, returning 13531 1726882448.96113: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-4fd9-519d-0000000000db] 13531 1726882448.96119: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000db 13531 1726882448.96388: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000db 13531 1726882448.96390: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13531 1726882448.96441: no more pending results, returning what we have 13531 1726882448.96445: results queue empty 13531 1726882448.96446: checking for any_errors_fatal 13531 1726882448.96452: done checking for any_errors_fatal 13531 1726882448.96453: checking for max_fail_percentage 13531 1726882448.96455: done checking for max_fail_percentage 13531 1726882448.96456: checking to see if all hosts have failed and the running result is not ok 13531 1726882448.96456: done checking to see if all hosts 
have failed 13531 1726882448.96457: getting the remaining hosts for this loop 13531 1726882448.96458: done getting the remaining hosts for this loop 13531 1726882448.96461: getting the next task for host managed_node2 13531 1726882448.96470: done getting next task for host managed_node2 13531 1726882448.96473: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13531 1726882448.96476: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882448.96488: getting variables 13531 1726882448.96489: in VariableManager get_vars() 13531 1726882448.96535: Calling all_inventory to load vars for managed_node2 13531 1726882448.96537: Calling groups_inventory to load vars for managed_node2 13531 1726882448.96539: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882448.96549: Calling all_plugins_play to load vars for managed_node2 13531 1726882448.96551: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882448.96553: Calling groups_plugins_play to load vars for managed_node2 13531 1726882448.98835: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882449.02462: done with get_vars() 13531 1726882449.02697: done getting variables 13531 1726882449.02766: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:34:09 -0400 (0:00:00.976) 0:00:36.923 ****** 13531 1726882449.02802: entering _queue_task() for managed_node2/service 13531 1726882449.03351: worker is 1 (out of 1 available) 13531 1726882449.03365: exiting _queue_task() for managed_node2/service 13531 1726882449.03378: done queuing things up, now waiting for results queue to drain 13531 1726882449.03379: waiting for pending results... 13531 1726882449.04375: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13531 1726882449.04748: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000000dc 13531 1726882449.04772: variable 'ansible_search_path' from source: unknown 13531 1726882449.04780: variable 'ansible_search_path' from source: unknown 13531 1726882449.04820: calling self._execute() 13531 1726882449.04925: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882449.05074: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882449.05089: variable 'omit' from source: magic vars 13531 1726882449.05800: variable 'ansible_distribution_major_version' from source: facts 13531 1726882449.05943: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882449.06071: variable 'network_provider' from source: set_fact 13531 1726882449.06151: Evaluated conditional (network_provider == "nm"): True 13531 1726882449.06249: variable '__network_wpa_supplicant_required' from source: role '' defaults 13531 1726882449.06449: variable '__network_ieee802_1x_connections_defined' from source: role '' 
defaults 13531 1726882449.06775: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13531 1726882449.12172: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13531 1726882449.12393: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13531 1726882449.12550: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13531 1726882449.12592: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13531 1726882449.12622: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13531 1726882449.12710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882449.12894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882449.12926: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882449.12976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882449.13098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882449.13149: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882449.13180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882449.13214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882449.13337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882449.13357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882449.13445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882449.13538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882449.13572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882449.13662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 
1726882449.13684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882449.13976: variable 'network_connections' from source: task vars 13531 1726882449.14070: variable 'controller_profile' from source: play vars 13531 1726882449.14144: variable 'controller_profile' from source: play vars 13531 1726882449.14282: variable 'controller_device' from source: play vars 13531 1726882449.14345: variable 'controller_device' from source: play vars 13531 1726882449.14359: variable 'port1_profile' from source: play vars 13531 1726882449.14423: variable 'port1_profile' from source: play vars 13531 1726882449.14497: variable 'dhcp_interface1' from source: play vars 13531 1726882449.14557: variable 'dhcp_interface1' from source: play vars 13531 1726882449.14714: variable 'controller_profile' from source: play vars 13531 1726882449.14775: variable 'controller_profile' from source: play vars 13531 1726882449.14787: variable 'port2_profile' from source: play vars 13531 1726882449.14961: variable 'port2_profile' from source: play vars 13531 1726882449.14977: variable 'dhcp_interface2' from source: play vars 13531 1726882449.15044: variable 'dhcp_interface2' from source: play vars 13531 1726882449.15146: variable 'controller_profile' from source: play vars 13531 1726882449.15227: variable 'controller_profile' from source: play vars 13531 1726882449.15322: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13531 1726882449.15752: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13531 1726882449.15828: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13531 1726882449.16013: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13531 1726882449.16047: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13531 1726882449.16099: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13531 1726882449.16130: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13531 1726882449.16247: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882449.16290: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13531 1726882449.16462: variable '__network_wireless_connections_defined' from source: role '' defaults 13531 1726882449.16936: variable 'network_connections' from source: task vars 13531 1726882449.16946: variable 'controller_profile' from source: play vars 13531 1726882449.17108: variable 'controller_profile' from source: play vars 13531 1726882449.17120: variable 'controller_device' from source: play vars 13531 1726882449.17213: variable 'controller_device' from source: play vars 13531 1726882449.17229: variable 'port1_profile' from source: play vars 13531 1726882449.17363: variable 'port1_profile' from source: play vars 13531 1726882449.17376: variable 'dhcp_interface1' from source: play vars 13531 1726882449.17527: variable 'dhcp_interface1' from source: play vars 13531 1726882449.17539: variable 'controller_profile' from source: play vars 13531 1726882449.17601: variable 'controller_profile' from source: play vars 13531 
1726882449.17748: variable 'port2_profile' from source: play vars 13531 1726882449.17812: variable 'port2_profile' from source: play vars 13531 1726882449.17823: variable 'dhcp_interface2' from source: play vars 13531 1726882449.17908: variable 'dhcp_interface2' from source: play vars 13531 1726882449.17966: variable 'controller_profile' from source: play vars 13531 1726882449.18124: variable 'controller_profile' from source: play vars 13531 1726882449.18287: Evaluated conditional (__network_wpa_supplicant_required): False 13531 1726882449.18295: when evaluation is False, skipping this task 13531 1726882449.18302: _execute() done 13531 1726882449.18309: dumping result to json 13531 1726882449.18316: done dumping result, returning 13531 1726882449.18328: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-4fd9-519d-0000000000dc] 13531 1726882449.18339: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000dc skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 13531 1726882449.18491: no more pending results, returning what we have 13531 1726882449.18495: results queue empty 13531 1726882449.18496: checking for any_errors_fatal 13531 1726882449.18517: done checking for any_errors_fatal 13531 1726882449.18519: checking for max_fail_percentage 13531 1726882449.18521: done checking for max_fail_percentage 13531 1726882449.18522: checking to see if all hosts have failed and the running result is not ok 13531 1726882449.18523: done checking to see if all hosts have failed 13531 1726882449.18524: getting the remaining hosts for this loop 13531 1726882449.18525: done getting the remaining hosts for this loop 13531 1726882449.18529: getting the next task for host managed_node2 13531 1726882449.18536: done getting next task for host managed_node2 13531 1726882449.18541: ^ task is: TASK: 
fedora.linux_system_roles.network : Enable network service 13531 1726882449.18544: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882449.18566: getting variables 13531 1726882449.18568: in VariableManager get_vars() 13531 1726882449.18627: Calling all_inventory to load vars for managed_node2 13531 1726882449.18630: Calling groups_inventory to load vars for managed_node2 13531 1726882449.18633: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882449.18644: Calling all_plugins_play to load vars for managed_node2 13531 1726882449.18647: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882449.18650: Calling groups_plugins_play to load vars for managed_node2 13531 1726882449.20557: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000dc 13531 1726882449.20562: WORKER PROCESS EXITING 13531 1726882449.22051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882449.26525: done with get_vars() 13531 1726882449.26569: done getting variables 13531 1726882449.26632: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable 
network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:34:09 -0400 (0:00:00.238) 0:00:37.161 ****** 13531 1726882449.26666: entering _queue_task() for managed_node2/service 13531 1726882449.27598: worker is 1 (out of 1 available) 13531 1726882449.27611: exiting _queue_task() for managed_node2/service 13531 1726882449.27624: done queuing things up, now waiting for results queue to drain 13531 1726882449.27625: waiting for pending results... 13531 1726882449.29007: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 13531 1726882449.29480: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000000dd 13531 1726882449.29494: variable 'ansible_search_path' from source: unknown 13531 1726882449.29724: variable 'ansible_search_path' from source: unknown 13531 1726882449.29768: calling self._execute() 13531 1726882449.30098: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882449.30102: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882449.30112: variable 'omit' from source: magic vars 13531 1726882449.31397: variable 'ansible_distribution_major_version' from source: facts 13531 1726882449.31408: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882449.31757: variable 'network_provider' from source: set_fact 13531 1726882449.31765: Evaluated conditional (network_provider == "initscripts"): False 13531 1726882449.31886: when evaluation is False, skipping this task 13531 1726882449.31890: _execute() done 13531 1726882449.31893: dumping result to json 13531 1726882449.31896: done dumping result, returning 13531 1726882449.31904: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-4fd9-519d-0000000000dd] 13531 1726882449.31912: sending task result 
for task 0e448fcc-3ce9-4fd9-519d-0000000000dd 13531 1726882449.32018: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000dd 13531 1726882449.32022: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13531 1726882449.32069: no more pending results, returning what we have 13531 1726882449.32074: results queue empty 13531 1726882449.32075: checking for any_errors_fatal 13531 1726882449.32082: done checking for any_errors_fatal 13531 1726882449.32083: checking for max_fail_percentage 13531 1726882449.32085: done checking for max_fail_percentage 13531 1726882449.32086: checking to see if all hosts have failed and the running result is not ok 13531 1726882449.32087: done checking to see if all hosts have failed 13531 1726882449.32088: getting the remaining hosts for this loop 13531 1726882449.32090: done getting the remaining hosts for this loop 13531 1726882449.32093: getting the next task for host managed_node2 13531 1726882449.32100: done getting next task for host managed_node2 13531 1726882449.32104: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13531 1726882449.32108: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882449.32132: getting variables 13531 1726882449.32133: in VariableManager get_vars() 13531 1726882449.32193: Calling all_inventory to load vars for managed_node2 13531 1726882449.32195: Calling groups_inventory to load vars for managed_node2 13531 1726882449.32198: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882449.32211: Calling all_plugins_play to load vars for managed_node2 13531 1726882449.32213: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882449.32216: Calling groups_plugins_play to load vars for managed_node2 13531 1726882449.34742: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882449.38239: done with get_vars() 13531 1726882449.38267: done getting variables 13531 1726882449.38329: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:34:09 -0400 (0:00:00.116) 0:00:37.278 ****** 13531 1726882449.38365: entering _queue_task() for managed_node2/copy 13531 1726882449.39295: worker is 1 (out of 1 available) 13531 1726882449.39309: exiting _queue_task() for managed_node2/copy 13531 1726882449.39324: done queuing things up, now waiting for results queue to drain 13531 1726882449.39325: waiting for pending results... 
13531 1726882449.40172: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13531 1726882449.40460: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000000de 13531 1726882449.40549: variable 'ansible_search_path' from source: unknown 13531 1726882449.40557: variable 'ansible_search_path' from source: unknown 13531 1726882449.40602: calling self._execute() 13531 1726882449.40858: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882449.40873: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882449.40979: variable 'omit' from source: magic vars 13531 1726882449.41691: variable 'ansible_distribution_major_version' from source: facts 13531 1726882449.41711: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882449.41952: variable 'network_provider' from source: set_fact 13531 1726882449.42057: Evaluated conditional (network_provider == "initscripts"): False 13531 1726882449.42067: when evaluation is False, skipping this task 13531 1726882449.42076: _execute() done 13531 1726882449.42083: dumping result to json 13531 1726882449.42090: done dumping result, returning 13531 1726882449.42102: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-4fd9-519d-0000000000de] 13531 1726882449.42113: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000de skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 13531 1726882449.42275: no more pending results, returning what we have 13531 1726882449.42280: results queue empty 13531 1726882449.42281: checking for any_errors_fatal 13531 1726882449.42288: done checking for any_errors_fatal 13531 1726882449.42289: checking for max_fail_percentage 13531 
1726882449.42291: done checking for max_fail_percentage 13531 1726882449.42292: checking to see if all hosts have failed and the running result is not ok 13531 1726882449.42292: done checking to see if all hosts have failed 13531 1726882449.42293: getting the remaining hosts for this loop 13531 1726882449.42295: done getting the remaining hosts for this loop 13531 1726882449.42298: getting the next task for host managed_node2 13531 1726882449.42305: done getting next task for host managed_node2 13531 1726882449.42309: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13531 1726882449.42313: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882449.42336: getting variables 13531 1726882449.42338: in VariableManager get_vars() 13531 1726882449.42401: Calling all_inventory to load vars for managed_node2 13531 1726882449.42405: Calling groups_inventory to load vars for managed_node2 13531 1726882449.42408: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882449.42421: Calling all_plugins_play to load vars for managed_node2 13531 1726882449.42424: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882449.42427: Calling groups_plugins_play to load vars for managed_node2 13531 1726882449.43974: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000de 13531 1726882449.43978: WORKER PROCESS EXITING 13531 1726882449.46150: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882449.51038: done with get_vars() 13531 1726882449.51079: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:34:09 -0400 (0:00:00.128) 0:00:37.408 ****** 13531 1726882449.51376: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 13531 1726882449.52122: worker is 1 (out of 1 available) 13531 1726882449.52134: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 13531 1726882449.52148: done queuing things up, now waiting for results queue to drain 13531 1726882449.52150: waiting for pending results... 
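For readers tracing the two conditionals logged above ("Evaluated conditional (ansible_distribution_major_version != '6'): True" and "(network_provider == \"initscripts\"): False"), the role task presumably carries a guard along these lines. This is a hedged sketch of the shape implied by the log, not the role's actual source:

```yaml
# Assumed shape reconstructed from the skip result in the log above;
# the real task lives in roles/network/tasks/main.yml of the collection.
- name: Ensure initscripts network file dependency is present
  ansible.builtin.copy:
    # (arguments elided; not recoverable from this log)
  when:
    - ansible_distribution_major_version != '6'
    - network_provider == "initscripts"   # False here, hence "skipping"
```

Because the second condition evaluated False, the executor short-circuits in `_execute()` and emits the `skip_reason: "Conditional result was False"` result seen above without ever queuing the module.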
13531 1726882449.53717: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13531 1726882449.54110: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000000df 13531 1726882449.54203: variable 'ansible_search_path' from source: unknown 13531 1726882449.54244: variable 'ansible_search_path' from source: unknown 13531 1726882449.54325: calling self._execute() 13531 1726882449.54604: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882449.54621: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882449.54677: variable 'omit' from source: magic vars 13531 1726882449.55577: variable 'ansible_distribution_major_version' from source: facts 13531 1726882449.55633: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882449.55645: variable 'omit' from source: magic vars 13531 1726882449.55872: variable 'omit' from source: magic vars 13531 1726882449.56332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13531 1726882449.61719: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13531 1726882449.61787: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13531 1726882449.61834: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13531 1726882449.61883: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13531 1726882449.61908: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13531 1726882449.62012: variable 'network_provider' from source: set_fact 13531 1726882449.62459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882449.62812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882449.62838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882449.62881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882449.62895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882449.62975: variable 'omit' from source: magic vars 13531 1726882449.63095: variable 'omit' from source: magic vars 13531 1726882449.63195: variable 'network_connections' from source: task vars 13531 1726882449.63206: variable 'controller_profile' from source: play vars 13531 1726882449.63264: variable 'controller_profile' from source: play vars 13531 1726882449.63580: variable 'controller_device' from source: play vars 13531 1726882449.63639: variable 'controller_device' from source: play vars 13531 1726882449.63794: variable 'port1_profile' from source: play vars 13531 1726882449.63798: variable 'port1_profile' from source: play vars 13531 1726882449.63800: variable 'dhcp_interface1' from source: play vars 13531 1726882449.63802: variable 'dhcp_interface1' from source: play vars 13531 1726882449.63804: variable 'controller_profile' from source: play vars 13531 1726882449.63869: variable 'controller_profile' from source: play vars 13531 1726882449.63873: 
variable 'port2_profile' from source: play vars 13531 1726882449.64074: variable 'port2_profile' from source: play vars 13531 1726882449.64078: variable 'dhcp_interface2' from source: play vars 13531 1726882449.64080: variable 'dhcp_interface2' from source: play vars 13531 1726882449.64082: variable 'controller_profile' from source: play vars 13531 1726882449.64084: variable 'controller_profile' from source: play vars 13531 1726882449.64227: variable 'omit' from source: magic vars 13531 1726882449.64235: variable '__lsr_ansible_managed' from source: task vars 13531 1726882449.64296: variable '__lsr_ansible_managed' from source: task vars 13531 1726882449.65226: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 13531 1726882449.65696: Loaded config def from plugin (lookup/template) 13531 1726882449.65700: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 13531 1726882449.65727: File lookup term: get_ansible_managed.j2 13531 1726882449.65732: variable 'ansible_search_path' from source: unknown 13531 1726882449.65735: evaluation_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 13531 1726882449.65749: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 13531 1726882449.65767: variable 'ansible_search_path' from source: unknown 13531 1726882449.74360: variable 'ansible_managed' from source: unknown 13531 1726882449.74638: variable 'omit' from source: magic vars 13531 1726882449.74688: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882449.74721: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882449.74738: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882449.74768: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882449.74792: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882449.74831: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882449.74836: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882449.74845: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882449.74941: Set connection var ansible_pipelining to False 13531 1726882449.74946: Set connection var ansible_timeout to 10 13531 1726882449.74952: Set connection var ansible_shell_executable to /bin/sh 13531 1726882449.74960: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882449.74975: Set connection var ansible_connection to ssh 13531 1726882449.74978: Set connection var ansible_shell_type to sh 13531 1726882449.75015: 
variable 'ansible_shell_executable' from source: unknown 13531 1726882449.75019: variable 'ansible_connection' from source: unknown 13531 1726882449.75025: variable 'ansible_module_compression' from source: unknown 13531 1726882449.75037: variable 'ansible_shell_type' from source: unknown 13531 1726882449.75043: variable 'ansible_shell_executable' from source: unknown 13531 1726882449.75046: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882449.75048: variable 'ansible_pipelining' from source: unknown 13531 1726882449.75050: variable 'ansible_timeout' from source: unknown 13531 1726882449.75056: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882449.75259: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13531 1726882449.75285: variable 'omit' from source: magic vars 13531 1726882449.75300: starting attempt loop 13531 1726882449.75303: running the handler 13531 1726882449.75329: _low_level_execute_command(): starting 13531 1726882449.75344: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13531 1726882449.76309: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882449.76319: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882449.76329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882449.76344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882449.76385: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882449.76395: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882449.76413: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882449.76433: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882449.76438: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882449.76445: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882449.76457: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882449.76465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882449.76476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882449.76484: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882449.76490: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882449.76500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882449.76574: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882449.76593: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882449.76604: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882449.76749: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882449.78404: stdout chunk (state=3): >>>/root <<< 13531 1726882449.78591: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882449.78595: stdout chunk (state=3): >>><<< 13531 1726882449.78601: stderr chunk (state=3): >>><<< 13531 1726882449.78624: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882449.78638: _low_level_execute_command(): starting 13531 1726882449.78644: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882449.7862437-15163-264002571876624 `" && echo ansible-tmp-1726882449.7862437-15163-264002571876624="` echo /root/.ansible/tmp/ansible-tmp-1726882449.7862437-15163-264002571876624 `" ) && sleep 0' 13531 1726882449.81974: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882449.81979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882449.82060: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882449.82119: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882449.82122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882449.82138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 13531 1726882449.82143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882449.82322: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882449.82325: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882449.82350: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882449.82494: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882449.84440: stdout chunk (state=3): >>>ansible-tmp-1726882449.7862437-15163-264002571876624=/root/.ansible/tmp/ansible-tmp-1726882449.7862437-15163-264002571876624 <<< 13531 1726882449.84615: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882449.84619: stderr chunk (state=3): >>><<< 13531 1726882449.84622: stdout chunk (state=3): >>><<< 13531 1726882449.84689: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882449.7862437-15163-264002571876624=/root/.ansible/tmp/ansible-tmp-1726882449.7862437-15163-264002571876624 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882449.84732: variable 'ansible_module_compression' from source: unknown 13531 1726882449.84817: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13531i_snf1g8/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 13531 1726882449.84848: variable 'ansible_facts' from source: unknown 13531 1726882449.85034: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882449.7862437-15163-264002571876624/AnsiballZ_network_connections.py 13531 1726882449.85893: Sending initial data 13531 1726882449.85914: Sent initial data (168 bytes) 13531 1726882449.90300: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882449.90316: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882449.90326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882449.90339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882449.90382: 
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882449.90435: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882449.90451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882449.90678: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882449.90695: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882449.90702: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882449.90720: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882449.90734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882449.90751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882449.90772: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882449.90788: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882449.90798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882449.90878: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882449.90897: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882449.90922: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882449.91047: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882449.92907: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension 
"fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13531 1726882449.93025: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13531 1726882449.93111: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13531i_snf1g8/tmpib72ke3p /root/.ansible/tmp/ansible-tmp-1726882449.7862437-15163-264002571876624/AnsiballZ_network_connections.py <<< 13531 1726882449.93209: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13531 1726882449.95207: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882449.95324: stderr chunk (state=3): >>><<< 13531 1726882449.95328: stdout chunk (state=3): >>><<< 13531 1726882449.95342: done transferring module to remote 13531 1726882449.95352: _low_level_execute_command(): starting 13531 1726882449.95370: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882449.7862437-15163-264002571876624/ /root/.ansible/tmp/ansible-tmp-1726882449.7862437-15163-264002571876624/AnsiballZ_network_connections.py && sleep 0' 13531 1726882449.96077: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882449.96094: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882449.96112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882449.96130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882449.96184: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882449.96200: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882449.96219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882449.96238: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882449.96259: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882449.96277: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882449.96291: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882449.96304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882449.96321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882449.96337: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882449.96348: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882449.96368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882449.96444: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882449.96478: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882449.96495: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882449.96625: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882449.98452: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882449.98512: stderr chunk (state=3): >>><<< 13531 1726882449.98534: stdout chunk (state=3): >>><<< 13531 1726882449.98550: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882449.98559: _low_level_execute_command(): starting 13531 1726882449.98562: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882449.7862437-15163-264002571876624/AnsiballZ_network_connections.py && sleep 0' 13531 1726882449.99114: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882449.99122: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882449.99130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882449.99140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882449.99195: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 
13531 1726882449.99199: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882449.99202: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882449.99209: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882449.99220: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882449.99235: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882449.99238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882449.99290: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882449.99293: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882449.99303: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882449.99426: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882450.44409: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 467176f0-8c25-4dd1-9498-f31f30164a10\n[008] #1, state:up persistent_state:present, 'bond0.0': update connection bond0.0, 207956e5-5781-4b3b-a739-18944f85bf50\n[009] #2, state:up persistent_state:present, 'bond0.1': update connection bond0.1, 5dda56b8-72f2-4584-944c-6391c4eeec78\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 467176f0-8c25-4dd1-9498-f31f30164a10 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection 
bond0.0, 207956e5-5781-4b3b-a739-18944f85bf50 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 5dda56b8-72f2-4584-944c-6391c4eeec78 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 13531 1726882450.46472: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 13531 1726882450.46546: stderr chunk (state=3): >>><<< 13531 1726882450.46550: stdout chunk (state=3): >>><<< 13531 1726882450.46670: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 467176f0-8c25-4dd1-9498-f31f30164a10\n[008] #1, state:up persistent_state:present, 'bond0.0': update connection bond0.0, 207956e5-5781-4b3b-a739-18944f85bf50\n[009] #2, state:up persistent_state:present, 'bond0.1': update connection bond0.1, 5dda56b8-72f2-4584-944c-6391c4eeec78\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 467176f0-8c25-4dd1-9498-f31f30164a10 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 207956e5-5781-4b3b-a739-18944f85bf50 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 5dda56b8-72f2-4584-944c-6391c4eeec78 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], 
"__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
13531 1726882450.46675: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'up', 'type': 'bond', 'interface_name': 'nm-bond', 'bond': {'mode': 'active-backup', 'miimon': 110}, 'ip': {'route_metric4': 65535}}, {'name': 'bond0.0', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test1', 'controller': 'bond0'}, {'name': 'bond0.1', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test2', 'controller': 'bond0'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882449.7862437-15163-264002571876624/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13531 1726882450.46683: _low_level_execute_command(): starting 13531 1726882450.46685: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882449.7862437-15163-264002571876624/ > /dev/null 2>&1 && sleep 0' 13531 1726882450.47969: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882450.48685: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882450.48700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882450.48717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882450.48766: 
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882450.48777: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882450.48790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882450.48806: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882450.48817: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882450.48827: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882450.48838: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882450.48850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882450.48872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882450.48885: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882450.48894: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882450.48905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882450.48986: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882450.49007: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882450.49021: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882450.49158: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882450.51076: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882450.51081: stdout chunk (state=3): >>><<< 13531 1726882450.51084: stderr chunk (state=3): >>><<< 13531 1726882450.51371: _low_level_execute_command() done: 
rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882450.51375: handler run complete 13531 1726882450.51378: attempt loop complete, returning result 13531 1726882450.51380: _execute() done 13531 1726882450.51382: dumping result to json 13531 1726882450.51384: done dumping result, returning 13531 1726882450.51386: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-4fd9-519d-0000000000df] 13531 1726882450.51388: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000df 13531 1726882450.51484: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000df 13531 1726882450.51488: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ 
{ "bond": { "miimon": 110, "mode": "active-backup" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 467176f0-8c25-4dd1-9498-f31f30164a10 [008] #1, state:up persistent_state:present, 'bond0.0': update connection bond0.0, 207956e5-5781-4b3b-a739-18944f85bf50 [009] #2, state:up persistent_state:present, 'bond0.1': update connection bond0.1, 5dda56b8-72f2-4584-944c-6391c4eeec78 [010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 467176f0-8c25-4dd1-9498-f31f30164a10 (is-modified) [011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 207956e5-5781-4b3b-a739-18944f85bf50 (not-active) [012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 5dda56b8-72f2-4584-944c-6391c4eeec78 (not-active) 13531 1726882450.51636: no more pending results, returning what we have 13531 1726882450.51640: results queue empty 13531 1726882450.51641: checking for any_errors_fatal 13531 1726882450.51648: done checking for any_errors_fatal 13531 1726882450.51649: checking for max_fail_percentage 13531 1726882450.51652: done checking for max_fail_percentage 13531 1726882450.51653: checking to see if all hosts have failed and the running result is not ok 13531 1726882450.51657: done checking to see if all hosts have failed 13531 1726882450.51657: getting the remaining hosts for this loop 13531 1726882450.51659: done getting the remaining hosts for this loop 13531 1726882450.51665: getting the next task for host managed_node2 13531 1726882450.51671: done getting next task for 
host managed_node2 13531 1726882450.51675: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 13531 1726882450.51679: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882450.51696: getting variables 13531 1726882450.51698: in VariableManager get_vars() 13531 1726882450.51759: Calling all_inventory to load vars for managed_node2 13531 1726882450.51761: Calling groups_inventory to load vars for managed_node2 13531 1726882450.51766: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882450.51777: Calling all_plugins_play to load vars for managed_node2 13531 1726882450.51780: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882450.51783: Calling groups_plugins_play to load vars for managed_node2 13531 1726882450.54709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882450.57621: done with get_vars() 13531 1726882450.57657: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:34:10 -0400 (0:00:01.063) 0:00:38.472 ****** 13531 1726882450.57753: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 13531 1726882450.58113: worker is 1 (out of 1 available) 13531 1726882450.58136: exiting 
_queue_task() for managed_node2/fedora.linux_system_roles.network_state 13531 1726882450.58150: done queuing things up, now waiting for results queue to drain 13531 1726882450.58151: waiting for pending results... 13531 1726882450.58352: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 13531 1726882450.58458: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000000e0 13531 1726882450.58469: variable 'ansible_search_path' from source: unknown 13531 1726882450.58479: variable 'ansible_search_path' from source: unknown 13531 1726882450.58517: calling self._execute() 13531 1726882450.58605: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882450.58608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882450.58617: variable 'omit' from source: magic vars 13531 1726882450.58894: variable 'ansible_distribution_major_version' from source: facts 13531 1726882450.58905: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882450.58996: variable 'network_state' from source: role '' defaults 13531 1726882450.59005: Evaluated conditional (network_state != {}): False 13531 1726882450.59010: when evaluation is False, skipping this task 13531 1726882450.59013: _execute() done 13531 1726882450.59016: dumping result to json 13531 1726882450.59018: done dumping result, returning 13531 1726882450.59023: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-4fd9-519d-0000000000e0] 13531 1726882450.59034: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000e0 13531 1726882450.59137: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000e0 13531 1726882450.59140: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13531 
1726882450.59200: no more pending results, returning what we have 13531 1726882450.59204: results queue empty 13531 1726882450.59205: checking for any_errors_fatal 13531 1726882450.59237: done checking for any_errors_fatal 13531 1726882450.59241: checking for max_fail_percentage 13531 1726882450.59275: done checking for max_fail_percentage 13531 1726882450.59276: checking to see if all hosts have failed and the running result is not ok 13531 1726882450.59277: done checking to see if all hosts have failed 13531 1726882450.59278: getting the remaining hosts for this loop 13531 1726882450.59279: done getting the remaining hosts for this loop 13531 1726882450.59283: getting the next task for host managed_node2 13531 1726882450.59289: done getting next task for host managed_node2 13531 1726882450.59293: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13531 1726882450.59296: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882450.59317: getting variables 13531 1726882450.59319: in VariableManager get_vars() 13531 1726882450.59391: Calling all_inventory to load vars for managed_node2 13531 1726882450.59395: Calling groups_inventory to load vars for managed_node2 13531 1726882450.59397: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882450.59409: Calling all_plugins_play to load vars for managed_node2 13531 1726882450.59413: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882450.59416: Calling groups_plugins_play to load vars for managed_node2 13531 1726882450.61803: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882450.63338: done with get_vars() 13531 1726882450.63369: done getting variables 13531 1726882450.63415: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:34:10 -0400 (0:00:00.056) 0:00:38.529 ****** 13531 1726882450.63441: entering _queue_task() for managed_node2/debug 13531 1726882450.63727: worker is 1 (out of 1 available) 13531 1726882450.63742: exiting _queue_task() for managed_node2/debug 13531 1726882450.63776: done queuing things up, now waiting for results queue to drain 13531 1726882450.63777: waiting for pending results... 
13531 1726882450.64024: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13531 1726882450.64394: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000000e1 13531 1726882450.64399: variable 'ansible_search_path' from source: unknown 13531 1726882450.64402: variable 'ansible_search_path' from source: unknown 13531 1726882450.64404: calling self._execute() 13531 1726882450.64407: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882450.64410: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882450.64413: variable 'omit' from source: magic vars 13531 1726882450.65288: variable 'ansible_distribution_major_version' from source: facts 13531 1726882450.65292: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882450.65295: variable 'omit' from source: magic vars 13531 1726882450.65298: variable 'omit' from source: magic vars 13531 1726882450.65300: variable 'omit' from source: magic vars 13531 1726882450.65302: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882450.65304: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882450.65307: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882450.65309: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882450.65310: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882450.65313: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882450.65315: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882450.65317: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 13531 1726882450.65319: Set connection var ansible_pipelining to False 13531 1726882450.65321: Set connection var ansible_timeout to 10 13531 1726882450.65323: Set connection var ansible_shell_executable to /bin/sh 13531 1726882450.65325: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882450.65327: Set connection var ansible_connection to ssh 13531 1726882450.65329: Set connection var ansible_shell_type to sh 13531 1726882450.65331: variable 'ansible_shell_executable' from source: unknown 13531 1726882450.65333: variable 'ansible_connection' from source: unknown 13531 1726882450.65335: variable 'ansible_module_compression' from source: unknown 13531 1726882450.65337: variable 'ansible_shell_type' from source: unknown 13531 1726882450.65339: variable 'ansible_shell_executable' from source: unknown 13531 1726882450.65341: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882450.65343: variable 'ansible_pipelining' from source: unknown 13531 1726882450.65344: variable 'ansible_timeout' from source: unknown 13531 1726882450.65346: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882450.65349: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882450.65352: variable 'omit' from source: magic vars 13531 1726882450.65353: starting attempt loop 13531 1726882450.65356: running the handler 13531 1726882450.65469: variable '__network_connections_result' from source: set_fact 13531 1726882450.65687: handler run complete 13531 1726882450.65690: attempt loop complete, returning result 13531 1726882450.65692: _execute() done 13531 1726882450.65694: dumping result to json 13531 1726882450.65696: 
done dumping result, returning 13531 1726882450.65698: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-4fd9-519d-0000000000e1] 13531 1726882450.65699: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000e1 13531 1726882450.65769: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000e1 13531 1726882450.65772: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 467176f0-8c25-4dd1-9498-f31f30164a10", "[008] #1, state:up persistent_state:present, 'bond0.0': update connection bond0.0, 207956e5-5781-4b3b-a739-18944f85bf50", "[009] #2, state:up persistent_state:present, 'bond0.1': update connection bond0.1, 5dda56b8-72f2-4584-944c-6391c4eeec78", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 467176f0-8c25-4dd1-9498-f31f30164a10 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 207956e5-5781-4b3b-a739-18944f85bf50 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 5dda56b8-72f2-4584-944c-6391c4eeec78 (not-active)" ] } 13531 1726882450.65838: no more pending results, returning what we have 13531 1726882450.65841: results queue empty 13531 1726882450.65842: checking for any_errors_fatal 13531 1726882450.65848: done checking for any_errors_fatal 13531 1726882450.65849: checking for max_fail_percentage 13531 1726882450.65850: done checking for max_fail_percentage 13531 1726882450.65851: checking to see if all hosts have failed and the running result is not ok 13531 1726882450.65852: done checking to see if all hosts have failed 13531 1726882450.65852: getting the remaining hosts for this loop 13531 1726882450.65854: done getting the remaining hosts for this loop 13531 1726882450.65857: getting the next task for 
host managed_node2 13531 1726882450.65862: done getting next task for host managed_node2 13531 1726882450.65868: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 13531 1726882450.65871: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882450.66500: getting variables 13531 1726882450.66503: in VariableManager get_vars() 13531 1726882450.66557: Calling all_inventory to load vars for managed_node2 13531 1726882450.66560: Calling groups_inventory to load vars for managed_node2 13531 1726882450.66566: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882450.66585: Calling all_plugins_play to load vars for managed_node2 13531 1726882450.66589: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882450.66592: Calling groups_plugins_play to load vars for managed_node2 13531 1726882450.68071: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882450.69324: done with get_vars() 13531 1726882450.69356: done getting variables 13531 1726882450.69402: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : 
Show debug messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:34:10 -0400 (0:00:00.059) 0:00:38.589 ****** 13531 1726882450.69432: entering _queue_task() for managed_node2/debug 13531 1726882450.69687: worker is 1 (out of 1 available) 13531 1726882450.69699: exiting _queue_task() for managed_node2/debug 13531 1726882450.69712: done queuing things up, now waiting for results queue to drain 13531 1726882450.69714: waiting for pending results... 13531 1726882450.69906: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 13531 1726882450.70000: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000000e2 13531 1726882450.70011: variable 'ansible_search_path' from source: unknown 13531 1726882450.70015: variable 'ansible_search_path' from source: unknown 13531 1726882450.70047: calling self._execute() 13531 1726882450.70124: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882450.70127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882450.70136: variable 'omit' from source: magic vars 13531 1726882450.70417: variable 'ansible_distribution_major_version' from source: facts 13531 1726882450.70429: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882450.70436: variable 'omit' from source: magic vars 13531 1726882450.70481: variable 'omit' from source: magic vars 13531 1726882450.70507: variable 'omit' from source: magic vars 13531 1726882450.70544: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882450.70572: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882450.70587: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882450.70602: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882450.70612: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882450.70638: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882450.70641: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882450.70644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882450.70713: Set connection var ansible_pipelining to False 13531 1726882450.70717: Set connection var ansible_timeout to 10 13531 1726882450.70723: Set connection var ansible_shell_executable to /bin/sh 13531 1726882450.70727: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882450.70730: Set connection var ansible_connection to ssh 13531 1726882450.70733: Set connection var ansible_shell_type to sh 13531 1726882450.70760: variable 'ansible_shell_executable' from source: unknown 13531 1726882450.70764: variable 'ansible_connection' from source: unknown 13531 1726882450.70768: variable 'ansible_module_compression' from source: unknown 13531 1726882450.70770: variable 'ansible_shell_type' from source: unknown 13531 1726882450.70772: variable 'ansible_shell_executable' from source: unknown 13531 1726882450.70774: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882450.70776: variable 'ansible_pipelining' from source: unknown 13531 1726882450.70778: variable 'ansible_timeout' from source: unknown 13531 1726882450.70780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882450.70880: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882450.70889: variable 'omit' from source: magic vars 13531 1726882450.70895: starting attempt loop 13531 1726882450.70898: running the handler 13531 1726882450.70938: variable '__network_connections_result' from source: set_fact 13531 1726882450.71000: variable '__network_connections_result' from source: set_fact 13531 1726882450.71123: handler run complete 13531 1726882450.71144: attempt loop complete, returning result 13531 1726882450.71147: _execute() done 13531 1726882450.71149: dumping result to json 13531 1726882450.71156: done dumping result, returning 13531 1726882450.71165: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-4fd9-519d-0000000000e2] 13531 1726882450.71215: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000e2 ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "miimon": 110, "mode": "active-backup" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 467176f0-8c25-4dd1-9498-f31f30164a10\n[008] #1, state:up persistent_state:present, 'bond0.0': update connection bond0.0, 207956e5-5781-4b3b-a739-18944f85bf50\n[009] #2, state:up 
persistent_state:present, 'bond0.1': update connection bond0.1, 5dda56b8-72f2-4584-944c-6391c4eeec78\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 467176f0-8c25-4dd1-9498-f31f30164a10 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 207956e5-5781-4b3b-a739-18944f85bf50 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 5dda56b8-72f2-4584-944c-6391c4eeec78 (not-active)\n", "stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 467176f0-8c25-4dd1-9498-f31f30164a10", "[008] #1, state:up persistent_state:present, 'bond0.0': update connection bond0.0, 207956e5-5781-4b3b-a739-18944f85bf50", "[009] #2, state:up persistent_state:present, 'bond0.1': update connection bond0.1, 5dda56b8-72f2-4584-944c-6391c4eeec78", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 467176f0-8c25-4dd1-9498-f31f30164a10 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 207956e5-5781-4b3b-a739-18944f85bf50 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 5dda56b8-72f2-4584-944c-6391c4eeec78 (not-active)" ] } } 13531 1726882450.71561: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000e2 13531 1726882450.71567: WORKER PROCESS EXITING 13531 1726882450.71583: no more pending results, returning what we have 13531 1726882450.71587: results queue empty 13531 1726882450.71594: checking for any_errors_fatal 13531 1726882450.71601: done checking for any_errors_fatal 13531 1726882450.71602: checking for max_fail_percentage 13531 1726882450.71604: done checking for max_fail_percentage 13531 1726882450.71605: checking to see if all hosts have failed and the running result is not ok 13531 1726882450.71606: done checking to see if all hosts have failed 13531 1726882450.71607: getting the remaining hosts for this loop 13531 
1726882450.71648: done getting the remaining hosts for this loop 13531 1726882450.71653: getting the next task for host managed_node2 13531 1726882450.71662: done getting next task for host managed_node2 13531 1726882450.71668: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 13531 1726882450.71672: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882450.71686: getting variables 13531 1726882450.71688: in VariableManager get_vars() 13531 1726882450.71858: Calling all_inventory to load vars for managed_node2 13531 1726882450.71940: Calling groups_inventory to load vars for managed_node2 13531 1726882450.71944: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882450.71957: Calling all_plugins_play to load vars for managed_node2 13531 1726882450.71960: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882450.71963: Calling groups_plugins_play to load vars for managed_node2 13531 1726882450.73167: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882450.74122: done with get_vars() 13531 1726882450.74143: done getting variables 13531 1726882450.74193: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:34:10 -0400 (0:00:00.047) 0:00:38.637 ****** 13531 1726882450.74219: entering _queue_task() for managed_node2/debug 13531 1726882450.74476: worker is 1 (out of 1 available) 13531 1726882450.74488: exiting _queue_task() for managed_node2/debug 13531 1726882450.74502: done queuing things up, now waiting for results queue to drain 13531 1726882450.74503: waiting for pending results... 13531 1726882450.74700: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 13531 1726882450.74795: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000000e3 13531 1726882450.74807: variable 'ansible_search_path' from source: unknown 13531 1726882450.74812: variable 'ansible_search_path' from source: unknown 13531 1726882450.74843: calling self._execute() 13531 1726882450.74923: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882450.74927: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882450.74934: variable 'omit' from source: magic vars 13531 1726882450.75214: variable 'ansible_distribution_major_version' from source: facts 13531 1726882450.75224: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882450.75313: variable 'network_state' from source: role '' defaults 13531 1726882450.75322: Evaluated conditional (network_state != {}): False 13531 1726882450.75325: when evaluation is False, skipping this task 13531 1726882450.75327: _execute() done 13531 1726882450.75330: dumping result to json 13531 1726882450.75339: done 
dumping result, returning 13531 1726882450.75358: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-4fd9-519d-0000000000e3] 13531 1726882450.75382: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000e3 13531 1726882450.75517: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000e3 13531 1726882450.75522: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 13531 1726882450.75587: no more pending results, returning what we have 13531 1726882450.75595: results queue empty 13531 1726882450.75596: checking for any_errors_fatal 13531 1726882450.75608: done checking for any_errors_fatal 13531 1726882450.75609: checking for max_fail_percentage 13531 1726882450.75611: done checking for max_fail_percentage 13531 1726882450.75613: checking to see if all hosts have failed and the running result is not ok 13531 1726882450.75614: done checking to see if all hosts have failed 13531 1726882450.75614: getting the remaining hosts for this loop 13531 1726882450.75616: done getting the remaining hosts for this loop 13531 1726882450.75619: getting the next task for host managed_node2 13531 1726882450.75627: done getting next task for host managed_node2 13531 1726882450.75643: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 13531 1726882450.75650: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 13531 1726882450.75683: getting variables 13531 1726882450.75685: in VariableManager get_vars() 13531 1726882450.75775: Calling all_inventory to load vars for managed_node2 13531 1726882450.75778: Calling groups_inventory to load vars for managed_node2 13531 1726882450.75781: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882450.75794: Calling all_plugins_play to load vars for managed_node2 13531 1726882450.75796: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882450.75798: Calling groups_plugins_play to load vars for managed_node2 13531 1726882450.77475: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882450.78802: done with get_vars() 13531 1726882450.78827: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:34:10 -0400 (0:00:00.047) 0:00:38.684 ****** 13531 1726882450.78948: entering _queue_task() for managed_node2/ping 13531 1726882450.79399: worker is 1 (out of 1 available) 13531 1726882450.79412: exiting _queue_task() for managed_node2/ping 13531 1726882450.79427: done queuing things up, now waiting for results queue to drain 13531 1726882450.79428: waiting for pending results... 
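The `module_args` dumped in the `__network_connections_result` record earlier in this transcript correspond to a role invocation roughly like the following. This is a hedged reconstruction, not the actual playbook that produced this run; the connection values (bond mode, `miimon`, `route_metric4`, interface names) are copied verbatim from the logged `module_args`, while the play header is assumed.

```yaml
# Hypothetical playbook sketch reconstructed from the logged module_args;
# only the connection values below are taken directly from the transcript.
- hosts: managed_node2
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_provider: nm
        network_connections:
          - name: bond0
            type: bond
            interface_name: nm-bond
            state: up
            bond:
              mode: active-backup
              miimon: 110
            ip:
              route_metric4: 65535
          - name: bond0.0
            type: ethernet
            interface_name: test1
            controller: bond0
            state: up
          - name: bond0.1
            type: ethernet
            interface_name: test2
            controller: bond0
            state: up
```

Note how the skip recorded just above follows from this shape: `network_state` is never set, so the role's `network_state != {}` conditional evaluates False and the "Show debug messages for the network_state" task is skipped.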
13531 1726882450.79777: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 13531 1726882450.80002: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000000e4 13531 1726882450.80007: variable 'ansible_search_path' from source: unknown 13531 1726882450.80011: variable 'ansible_search_path' from source: unknown 13531 1726882450.80068: calling self._execute() 13531 1726882450.80210: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882450.80214: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882450.80217: variable 'omit' from source: magic vars 13531 1726882450.80653: variable 'ansible_distribution_major_version' from source: facts 13531 1726882450.80667: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882450.80677: variable 'omit' from source: magic vars 13531 1726882450.80724: variable 'omit' from source: magic vars 13531 1726882450.80748: variable 'omit' from source: magic vars 13531 1726882450.80802: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882450.80832: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882450.80861: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882450.80890: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882450.80919: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882450.80947: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882450.80950: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882450.80953: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 13531 1726882450.81072: Set connection var ansible_pipelining to False 13531 1726882450.81089: Set connection var ansible_timeout to 10 13531 1726882450.81092: Set connection var ansible_shell_executable to /bin/sh 13531 1726882450.81108: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882450.81111: Set connection var ansible_connection to ssh 13531 1726882450.81113: Set connection var ansible_shell_type to sh 13531 1726882450.81141: variable 'ansible_shell_executable' from source: unknown 13531 1726882450.81144: variable 'ansible_connection' from source: unknown 13531 1726882450.81147: variable 'ansible_module_compression' from source: unknown 13531 1726882450.81168: variable 'ansible_shell_type' from source: unknown 13531 1726882450.81172: variable 'ansible_shell_executable' from source: unknown 13531 1726882450.81174: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882450.81176: variable 'ansible_pipelining' from source: unknown 13531 1726882450.81178: variable 'ansible_timeout' from source: unknown 13531 1726882450.81180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882450.81436: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13531 1726882450.81444: variable 'omit' from source: magic vars 13531 1726882450.81449: starting attempt loop 13531 1726882450.81452: running the handler 13531 1726882450.81481: _low_level_execute_command(): starting 13531 1726882450.81510: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13531 1726882450.82480: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 
1726882450.82484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 13531 1726882450.82486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882450.82489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 13531 1726882450.82491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882450.82539: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882450.82568: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882450.82591: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882450.82727: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882450.84417: stdout chunk (state=3): >>>/root <<< 13531 1726882450.84506: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882450.84572: stderr chunk (state=3): >>><<< 13531 1726882450.84575: stdout chunk (state=3): >>><<< 13531 1726882450.84601: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882450.84615: _low_level_execute_command(): starting 13531 1726882450.84621: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882450.846011-15219-211246338462644 `" && echo ansible-tmp-1726882450.846011-15219-211246338462644="` echo /root/.ansible/tmp/ansible-tmp-1726882450.846011-15219-211246338462644 `" ) && sleep 0' 13531 1726882450.85109: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882450.85114: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882450.85126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882450.85137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882450.85173: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882450.85186: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882450.85189: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882450.85197: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882450.85206: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882450.85211: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882450.85216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882450.85237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 13531 1726882450.85239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882450.85290: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882450.85295: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882450.85314: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882450.85427: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882450.87334: stdout chunk (state=3): >>>ansible-tmp-1726882450.846011-15219-211246338462644=/root/.ansible/tmp/ansible-tmp-1726882450.846011-15219-211246338462644 <<< 13531 1726882450.87442: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882450.87510: stderr chunk (state=3): >>><<< 13531 1726882450.87514: stdout chunk (state=3): >>><<< 13531 1726882450.87531: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882450.846011-15219-211246338462644=/root/.ansible/tmp/ansible-tmp-1726882450.846011-15219-211246338462644 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882450.87574: variable 'ansible_module_compression' from source: unknown 13531 1726882450.87609: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13531i_snf1g8/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 13531 1726882450.87641: variable 'ansible_facts' from source: unknown 13531 1726882450.87698: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882450.846011-15219-211246338462644/AnsiballZ_ping.py 13531 1726882450.87809: Sending initial data 13531 1726882450.87812: Sent initial data (152 bytes) 13531 1726882450.88935: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882450.88940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882450.88971: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882450.88975: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882450.88977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882450.89057: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882450.89075: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882450.89181: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882450.90949: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13531 1726882450.91031: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13531 1726882450.91138: 
stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13531i_snf1g8/tmpyy0g3ob6 /root/.ansible/tmp/ansible-tmp-1726882450.846011-15219-211246338462644/AnsiballZ_ping.py <<< 13531 1726882450.91230: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13531 1726882450.92347: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882450.92470: stderr chunk (state=3): >>><<< 13531 1726882450.92473: stdout chunk (state=3): >>><<< 13531 1726882450.92486: done transferring module to remote 13531 1726882450.92499: _low_level_execute_command(): starting 13531 1726882450.92503: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882450.846011-15219-211246338462644/ /root/.ansible/tmp/ansible-tmp-1726882450.846011-15219-211246338462644/AnsiballZ_ping.py && sleep 0' 13531 1726882450.92971: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882450.92977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882450.93026: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 13531 1726882450.93030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882450.93032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found 
<<< 13531 1726882450.93038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882450.93100: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882450.93126: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882450.93249: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882450.95092: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882450.95210: stderr chunk (state=3): >>><<< 13531 1726882450.95216: stdout chunk (state=3): >>><<< 13531 1726882450.95237: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882450.95241: _low_level_execute_command(): starting 13531 1726882450.95245: 
_low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882450.846011-15219-211246338462644/AnsiballZ_ping.py && sleep 0' 13531 1726882450.95985: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882450.95993: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882450.96060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882450.96065: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882450.96097: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882450.96117: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882450.96139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882450.96167: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882450.96181: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882450.96192: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882450.96214: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882450.96229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882450.96245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882450.96257: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882450.96272: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882450.96294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882450.96386: stderr 
chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882450.96403: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882450.96417: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882450.96600: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882451.09703: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 13531 1726882451.10696: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 13531 1726882451.10786: stderr chunk (state=3): >>><<< 13531 1726882451.10789: stdout chunk (state=3): >>><<< 13531 1726882451.10925: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 Shared connection to 10.31.11.158 closed. 13531 1726882451.10930: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882450.846011-15219-211246338462644/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13531 1726882451.10936: _low_level_execute_command(): starting 13531 1726882451.10938: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882450.846011-15219-211246338462644/ > /dev/null 2>&1 && sleep 0' 13531 1726882451.11568: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882451.11585: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882451.11601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882451.11621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882451.11671: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882451.11685: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882451.11700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882451.11718: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882451.11730: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882451.11741: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882451.11766: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882451.11781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882451.11797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882451.11810: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882451.11822: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882451.11836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882451.11917: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882451.11935: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882451.11949: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882451.12088: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882451.13940: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882451.14187: stderr chunk (state=3): >>><<< 13531 1726882451.14191: stdout chunk (state=3): >>><<< 13531 1726882451.14704: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882451.14709: handler run complete 13531 1726882451.14715: attempt loop complete, returning result 13531 1726882451.14717: _execute() done 13531 1726882451.14720: dumping result to json 13531 1726882451.14722: done dumping result, returning 13531 1726882451.14724: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-4fd9-519d-0000000000e4] 13531 1726882451.14726: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000e4 13531 1726882451.14849: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000000e4 13531 1726882451.14855: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 13531 1726882451.14961: no more pending results, returning what we have 13531 1726882451.14969: results queue empty 13531 1726882451.14973: checking for any_errors_fatal 13531 1726882451.14982: done checking for any_errors_fatal 13531 1726882451.14983: checking for max_fail_percentage 13531 1726882451.14985: done checking for max_fail_percentage 13531 1726882451.14991: checking to see if all hosts have failed and the running result is not ok 13531 1726882451.14994: done checking to see if all hosts have failed 13531 1726882451.14994: getting the remaining hosts for this loop 13531 
1726882451.14996: done getting the remaining hosts for this loop 13531 1726882451.15002: getting the next task for host managed_node2 13531 1726882451.15011: done getting next task for host managed_node2 13531 1726882451.15016: ^ task is: TASK: meta (role_complete) 13531 1726882451.15023: ^ state is: HOST STATE: block=2, task=28, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882451.15046: getting variables 13531 1726882451.15048: in VariableManager get_vars() 13531 1726882451.15108: Calling all_inventory to load vars for managed_node2 13531 1726882451.15111: Calling groups_inventory to load vars for managed_node2 13531 1726882451.15114: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882451.15123: Calling all_plugins_play to load vars for managed_node2 13531 1726882451.15126: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882451.15129: Calling groups_plugins_play to load vars for managed_node2 13531 1726882451.16787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882451.19299: done with get_vars() 13531 1726882451.19334: done getting variables 13531 1726882451.19425: done queuing things up, now waiting for results queue to drain 13531 1726882451.19427: results queue empty 13531 1726882451.19428: checking for any_errors_fatal 13531 1726882451.19431: done checking for any_errors_fatal 13531 1726882451.19432: checking for max_fail_percentage 13531 
1726882451.19433: done checking for max_fail_percentage 13531 1726882451.19434: checking to see if all hosts have failed and the running result is not ok 13531 1726882451.19435: done checking to see if all hosts have failed 13531 1726882451.19436: getting the remaining hosts for this loop 13531 1726882451.19437: done getting the remaining hosts for this loop 13531 1726882451.19439: getting the next task for host managed_node2 13531 1726882451.19445: done getting next task for host managed_node2 13531 1726882451.19449: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13531 1726882451.19451: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882451.19467: getting variables 13531 1726882451.19469: in VariableManager get_vars() 13531 1726882451.19494: Calling all_inventory to load vars for managed_node2 13531 1726882451.19496: Calling groups_inventory to load vars for managed_node2 13531 1726882451.19498: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882451.19503: Calling all_plugins_play to load vars for managed_node2 13531 1726882451.19506: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882451.19509: Calling groups_plugins_play to load vars for managed_node2 13531 1726882451.20877: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882451.22581: done with get_vars() 13531 1726882451.22616: done getting variables
TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4
Friday 20 September 2024 21:34:11 -0400 (0:00:00.437) 0:00:39.122 ******
13531 1726882451.22710: entering _queue_task() for managed_node2/include_tasks 13531 1726882451.23051: worker is 1 (out of 1 available) 13531 1726882451.23070: exiting _queue_task() for managed_node2/include_tasks 13531 1726882451.23085: done queuing things up, now waiting for results queue to drain 13531 1726882451.23086: waiting for pending results... 
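The Re-test connectivity task above completes with `ok: [managed_node2] => {"changed": false, "ping": "pong"}`. As a rough sketch only (the real `ansible.builtin.ping` module is shipped to the target as an AnsiballZ payload and executed under `/bin/sh`, as the `_low_level_execute_command()` records show), its contract can be illustrated like this:

```python
def ping_module(module_args):
    """Illustrative sketch of the ansible.builtin.ping contract, whose
    round trip appears in the log above ({"ping": "pong", ...}).
    Not the real module code: just the documented behavior."""
    data = module_args.get("data", "pong")  # 'data' defaults to "pong"
    if data == "crash":
        # documented way to force the module to raise an exception
        raise RuntimeError("boom")
    return {"ping": data, "changed": False}
```

Calling `ping_module({"data": "pong"})` yields `{"ping": "pong", "changed": False}`, matching the `ok` result printed for managed_node2.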
13531 1726882451.23720: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13531 1726882451.24092: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000011b 13531 1726882451.24113: variable 'ansible_search_path' from source: unknown 13531 1726882451.24122: variable 'ansible_search_path' from source: unknown 13531 1726882451.24169: calling self._execute() 13531 1726882451.24272: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882451.24405: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882451.24419: variable 'omit' from source: magic vars 13531 1726882451.25143: variable 'ansible_distribution_major_version' from source: facts 13531 1726882451.25284: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882451.25295: _execute() done 13531 1726882451.25302: dumping result to json 13531 1726882451.25309: done dumping result, returning 13531 1726882451.25319: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-4fd9-519d-00000000011b] 13531 1726882451.25330: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000011b 13531 1726882451.25480: no more pending results, returning what we have 13531 1726882451.25485: in VariableManager get_vars() 13531 1726882451.25552: Calling all_inventory to load vars for managed_node2 13531 1726882451.25558: Calling groups_inventory to load vars for managed_node2 13531 1726882451.25561: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882451.25577: Calling all_plugins_play to load vars for managed_node2 13531 1726882451.25581: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882451.25584: Calling groups_plugins_play to load vars for managed_node2 13531 1726882451.26857: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000011b 13531 
1726882451.26861: WORKER PROCESS EXITING 13531 1726882451.27953: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882451.29900: done with get_vars() 13531 1726882451.29932: variable 'ansible_search_path' from source: unknown 13531 1726882451.29933: variable 'ansible_search_path' from source: unknown 13531 1726882451.29981: we have included files to process 13531 1726882451.29982: generating all_blocks data 13531 1726882451.29985: done generating all_blocks data 13531 1726882451.29992: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13531 1726882451.29993: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13531 1726882451.29995: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13531 1726882451.30598: done processing included file 13531 1726882451.30601: iterating over new_blocks loaded from include file 13531 1726882451.30603: in VariableManager get_vars() 13531 1726882451.30638: done with get_vars() 13531 1726882451.30640: filtering new block on tags 13531 1726882451.30662: done filtering new block on tags 13531 1726882451.30667: in VariableManager get_vars() 13531 1726882451.30699: done with get_vars() 13531 1726882451.30701: filtering new block on tags 13531 1726882451.30722: done filtering new block on tags 13531 1726882451.30725: in VariableManager get_vars() 13531 1726882451.30759: done with get_vars() 13531 1726882451.30761: filtering new block on tags 13531 1726882451.30782: done filtering new block on tags 13531 1726882451.30785: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 13531 1726882451.30790: extending task lists for all hosts 
with included blocks 13531 1726882451.32087: done extending task lists 13531 1726882451.32089: done processing included files 13531 1726882451.32089: results queue empty 13531 1726882451.32090: checking for any_errors_fatal 13531 1726882451.32091: done checking for any_errors_fatal 13531 1726882451.32091: checking for max_fail_percentage 13531 1726882451.32092: done checking for max_fail_percentage 13531 1726882451.32093: checking to see if all hosts have failed and the running result is not ok 13531 1726882451.32094: done checking to see if all hosts have failed 13531 1726882451.32094: getting the remaining hosts for this loop 13531 1726882451.32095: done getting the remaining hosts for this loop 13531 1726882451.32097: getting the next task for host managed_node2 13531 1726882451.32099: done getting next task for host managed_node2 13531 1726882451.32101: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 13531 1726882451.32104: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882451.32112: getting variables 13531 1726882451.32113: in VariableManager get_vars() 13531 1726882451.32143: Calling all_inventory to load vars for managed_node2 13531 1726882451.32146: Calling groups_inventory to load vars for managed_node2 13531 1726882451.32149: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882451.32157: Calling all_plugins_play to load vars for managed_node2 13531 1726882451.32160: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882451.32162: Calling groups_plugins_play to load vars for managed_node2 13531 1726882451.32990: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882451.39811: done with get_vars() 13531 1726882451.39843: done getting variables
TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3
Friday 20 September 2024 21:34:11 -0400 (0:00:00.172) 0:00:39.294 ******
13531 1726882451.39931: entering _queue_task() for managed_node2/setup 13531 1726882451.40356: worker is 1 (out of 1 available) 13531 1726882451.40370: exiting _queue_task() for managed_node2/setup 13531 1726882451.40384: done queuing things up, now waiting for results queue to drain 13531 1726882451.40386: waiting for pending results... 
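Every task in this run logs `Evaluated conditional (ansible_distribution_major_version != '6'): True` before proceeding. Ansible evaluates that `when:` guard as a Jinja2 expression inside TaskExecutor; a pure-Python mirror of the logic (illustrative only, not Ansible's implementation) looks like:

```python
def distribution_gate(facts):
    """Pure-Python equivalent of the guard evaluated repeatedly above:
    (ansible_distribution_major_version != '6').
    Returns True when the role's tasks should run on this host."""
    return facts.get("ansible_distribution_major_version") != "6"
```

On the managed nodes in this run the fact is not '6', so the gate passes and each task continues into its own checks.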
13531 1726882451.40699: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 13531 1726882451.40873: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000084f 13531 1726882451.40888: variable 'ansible_search_path' from source: unknown 13531 1726882451.40891: variable 'ansible_search_path' from source: unknown 13531 1726882451.40928: calling self._execute() 13531 1726882451.41043: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882451.41056: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882451.41070: variable 'omit' from source: magic vars 13531 1726882451.41454: variable 'ansible_distribution_major_version' from source: facts 13531 1726882451.41471: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882451.41839: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13531 1726882451.44797: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13531 1726882451.44993: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13531 1726882451.45032: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13531 1726882451.45073: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13531 1726882451.45254: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13531 1726882451.45359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882451.45391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882451.45426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882451.45477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882451.45491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882451.45549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882451.45576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882451.45598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882451.45639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882451.45653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882451.45819: variable '__network_required_facts' from source: role 
'' defaults 13531 1726882451.45828: variable 'ansible_facts' from source: unknown 13531 1726882451.46681: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 13531 1726882451.46685: when evaluation is False, skipping this task 13531 1726882451.46688: _execute() done 13531 1726882451.46690: dumping result to json 13531 1726882451.46692: done dumping result, returning 13531 1726882451.46695: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-4fd9-519d-00000000084f] 13531 1726882451.46703: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000084f 13531 1726882451.46809: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000084f 13531 1726882451.46811: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13531 1726882451.46862: no more pending results, returning what we have 13531 1726882451.46868: results queue empty 13531 1726882451.46869: checking for any_errors_fatal 13531 1726882451.46871: done checking for any_errors_fatal 13531 1726882451.46871: checking for max_fail_percentage 13531 1726882451.46873: done checking for max_fail_percentage 13531 1726882451.46874: checking to see if all hosts have failed and the running result is not ok 13531 1726882451.46875: done checking to see if all hosts have failed 13531 1726882451.46876: getting the remaining hosts for this loop 13531 1726882451.46877: done getting the remaining hosts for this loop 13531 1726882451.46881: getting the next task for host managed_node2 13531 1726882451.46897: done getting next task for host managed_node2 13531 1726882451.46902: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 13531 1726882451.46906: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882451.46926: getting variables 13531 1726882451.46928: in VariableManager get_vars() 13531 1726882451.46992: Calling all_inventory to load vars for managed_node2 13531 1726882451.46995: Calling groups_inventory to load vars for managed_node2 13531 1726882451.46998: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882451.47010: Calling all_plugins_play to load vars for managed_node2 13531 1726882451.47013: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882451.47016: Calling groups_plugins_play to load vars for managed_node2 13531 1726882451.48887: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882451.51192: done with get_vars() 13531 1726882451.51225: done getting variables
TASK [fedora.linux_system_roles.network : Check if system is ostree] ***********
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12
Friday 20 September 2024 21:34:11 -0400 (0:00:00.114) 0:00:39.408 ******
13531 1726882451.51341: entering _queue_task() for managed_node2/stat 13531 1726882451.51679: worker is 1 (out of 1 
available) 13531 1726882451.51692: exiting _queue_task() for managed_node2/stat 13531 1726882451.51705: done queuing things up, now waiting for results queue to drain 13531 1726882451.51706: waiting for pending results... 13531 1726882451.52012: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 13531 1726882451.52187: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000851 13531 1726882451.52208: variable 'ansible_search_path' from source: unknown 13531 1726882451.52215: variable 'ansible_search_path' from source: unknown 13531 1726882451.52267: calling self._execute() 13531 1726882451.52371: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882451.52383: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882451.52397: variable 'omit' from source: magic vars 13531 1726882451.52787: variable 'ansible_distribution_major_version' from source: facts 13531 1726882451.52806: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882451.52980: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13531 1726882451.53269: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13531 1726882451.53318: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13531 1726882451.53386: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13531 1726882451.53426: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13531 1726882451.53518: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13531 1726882451.53547: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13531 1726882451.53585: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882451.53616: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13531 1726882451.53718: variable '__network_is_ostree' from source: set_fact 13531 1726882451.53730: Evaluated conditional (not __network_is_ostree is defined): False 13531 1726882451.53737: when evaluation is False, skipping this task 13531 1726882451.53743: _execute() done 13531 1726882451.53749: dumping result to json 13531 1726882451.53758: done dumping result, returning 13531 1726882451.53772: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-4fd9-519d-000000000851] 13531 1726882451.53789: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000851 skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 13531 1726882451.53944: no more pending results, returning what we have 13531 1726882451.53949: results queue empty 13531 1726882451.53950: checking for any_errors_fatal 13531 1726882451.53959: done checking for any_errors_fatal 13531 1726882451.53960: checking for max_fail_percentage 13531 1726882451.53962: done checking for max_fail_percentage 13531 1726882451.53966: checking to see if all hosts have failed and the running result is not ok 13531 1726882451.53967: done checking to see if all hosts have failed 13531 1726882451.53967: getting the remaining hosts for this loop 13531 
1726882451.53969: done getting the remaining hosts for this loop 13531 1726882451.53973: getting the next task for host managed_node2 13531 1726882451.53980: done getting next task for host managed_node2 13531 1726882451.53985: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 13531 1726882451.53989: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882451.54012: getting variables 13531 1726882451.54014: in VariableManager get_vars() 13531 1726882451.54076: Calling all_inventory to load vars for managed_node2 13531 1726882451.54079: Calling groups_inventory to load vars for managed_node2 13531 1726882451.54082: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882451.54093: Calling all_plugins_play to load vars for managed_node2 13531 1726882451.54096: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882451.54099: Calling groups_plugins_play to load vars for managed_node2 13531 1726882451.55086: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000851 13531 1726882451.55091: WORKER PROCESS EXITING 13531 1726882451.56050: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882451.58475: done with get_vars() 13531 1726882451.58509: done getting variables 13531 1726882451.58689: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17
Friday 20 September 2024 21:34:11 -0400 (0:00:00.073) 0:00:39.482 ******
13531 1726882451.58730: entering _queue_task() for managed_node2/set_fact 13531 1726882451.59350: worker is 1 (out of 1 available) 13531 1726882451.59368: exiting _queue_task() for managed_node2/set_fact 13531 1726882451.59380: done queuing things up, now waiting for results queue to drain 13531 1726882451.59382: waiting for pending results... 
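The "Ensure ansible_facts used by role are present" task above is skipped because `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0` evaluated to False: every fact the role needs is already cached. A set-based Python reading of that Jinja2 conditional (a sketch of the logic, not Ansible's code) is:

```python
def needs_fact_gathering(required_facts, ansible_facts):
    """Mirror of the conditional the log shows evaluating to False:
    __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
    True would mean some required fact is missing, so the setup (fact
    gathering) task has to run; False lets Ansible skip it, as above."""
    missing = set(required_facts) - set(ansible_facts)
    return len(missing) > 0
```

Because the play gathered facts earlier, the difference is empty here and the setup task is skipped rather than re-run.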
13531 1726882451.60472: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 13531 1726882451.60895: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000852 13531 1726882451.61043: variable 'ansible_search_path' from source: unknown 13531 1726882451.61048: variable 'ansible_search_path' from source: unknown 13531 1726882451.61092: calling self._execute() 13531 1726882451.61506: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882451.61510: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882451.61522: variable 'omit' from source: magic vars 13531 1726882451.61981: variable 'ansible_distribution_major_version' from source: facts 13531 1726882451.61993: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882451.62208: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13531 1726882451.62511: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13531 1726882451.62560: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13531 1726882451.62649: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13531 1726882451.62816: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13531 1726882451.62944: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13531 1726882451.62971: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13531 1726882451.63001: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882451.63119: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13531 1726882451.63228: variable '__network_is_ostree' from source: set_fact 13531 1726882451.63241: Evaluated conditional (not __network_is_ostree is defined): False 13531 1726882451.63248: when evaluation is False, skipping this task 13531 1726882451.63259: _execute() done 13531 1726882451.63268: dumping result to json 13531 1726882451.63276: done dumping result, returning 13531 1726882451.63289: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-4fd9-519d-000000000852] 13531 1726882451.63371: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000852 skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 13531 1726882451.63590: no more pending results, returning what we have 13531 1726882451.63599: results queue empty 13531 1726882451.63600: checking for any_errors_fatal 13531 1726882451.63609: done checking for any_errors_fatal 13531 1726882451.63610: checking for max_fail_percentage 13531 1726882451.63612: done checking for max_fail_percentage 13531 1726882451.63613: checking to see if all hosts have failed and the running result is not ok 13531 1726882451.63614: done checking to see if all hosts have failed 13531 1726882451.63615: getting the remaining hosts for this loop 13531 1726882451.63616: done getting the remaining hosts for this loop 13531 1726882451.63620: getting the next task for host managed_node2 13531 1726882451.63629: done getting next task for host managed_node2 13531 
1726882451.63633: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 13531 1726882451.63637: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882451.63660: getting variables 13531 1726882451.63662: in VariableManager get_vars() 13531 1726882451.63736: Calling all_inventory to load vars for managed_node2 13531 1726882451.63739: Calling groups_inventory to load vars for managed_node2 13531 1726882451.63741: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882451.63756: Calling all_plugins_play to load vars for managed_node2 13531 1726882451.63761: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882451.63772: Calling groups_plugins_play to load vars for managed_node2 13531 1726882451.64389: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000852 13531 1726882451.64393: WORKER PROCESS EXITING 13531 1726882451.65323: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882451.67820: done with get_vars() 13531 1726882451.67837: done getting variables TASK [fedora.linux_system_roles.network : Check which 
services are running] **** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:34:11 -0400 (0:00:00.091) 0:00:39.574 ****** 13531 1726882451.67919: entering _queue_task() for managed_node2/service_facts 13531 1726882451.68178: worker is 1 (out of 1 available) 13531 1726882451.68191: exiting _queue_task() for managed_node2/service_facts 13531 1726882451.68203: done queuing things up, now waiting for results queue to drain 13531 1726882451.68204: waiting for pending results... 13531 1726882451.68405: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 13531 1726882451.68513: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000854 13531 1726882451.68527: variable 'ansible_search_path' from source: unknown 13531 1726882451.68530: variable 'ansible_search_path' from source: unknown 13531 1726882451.68561: calling self._execute() 13531 1726882451.68637: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882451.68642: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882451.68650: variable 'omit' from source: magic vars 13531 1726882451.69077: variable 'ansible_distribution_major_version' from source: facts 13531 1726882451.69081: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882451.69084: variable 'omit' from source: magic vars 13531 1726882451.69087: variable 'omit' from source: magic vars 13531 1726882451.69286: variable 'omit' from source: magic vars 13531 1726882451.69289: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882451.69292: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882451.69294: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 
1726882451.69297: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882451.69299: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882451.69302: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882451.69304: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882451.69306: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882451.69385: Set connection var ansible_pipelining to False 13531 1726882451.69389: Set connection var ansible_timeout to 10 13531 1726882451.69392: Set connection var ansible_shell_executable to /bin/sh 13531 1726882451.69395: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882451.69397: Set connection var ansible_connection to ssh 13531 1726882451.69399: Set connection var ansible_shell_type to sh 13531 1726882451.69424: variable 'ansible_shell_executable' from source: unknown 13531 1726882451.69428: variable 'ansible_connection' from source: unknown 13531 1726882451.69430: variable 'ansible_module_compression' from source: unknown 13531 1726882451.69433: variable 'ansible_shell_type' from source: unknown 13531 1726882451.69436: variable 'ansible_shell_executable' from source: unknown 13531 1726882451.69438: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882451.69441: variable 'ansible_pipelining' from source: unknown 13531 1726882451.69443: variable 'ansible_timeout' from source: unknown 13531 1726882451.69447: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882451.69647: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13531 1726882451.69660: variable 'omit' from source: magic vars 13531 1726882451.69665: starting attempt loop 13531 1726882451.69668: running the handler 13531 1726882451.69682: _low_level_execute_command(): starting 13531 1726882451.69688: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13531 1726882451.70735: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882451.70740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882451.70780: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882451.70785: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882451.70800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882451.70819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 13531 1726882451.70825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882451.70906: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882451.70934: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882451.70937: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 13531 1726882451.71083: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882451.72774: stdout chunk (state=3): >>>/root <<< 13531 1726882451.72879: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882451.72934: stderr chunk (state=3): >>><<< 13531 1726882451.72937: stdout chunk (state=3): >>><<< 13531 1726882451.72961: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882451.72974: _low_level_execute_command(): starting 13531 1726882451.72982: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882451.72959-15257-103406312520377 `" && echo ansible-tmp-1726882451.72959-15257-103406312520377="` 
echo /root/.ansible/tmp/ansible-tmp-1726882451.72959-15257-103406312520377 `" ) && sleep 0' 13531 1726882451.73475: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882451.73479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882451.73512: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882451.73524: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882451.73593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882451.73597: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882451.73600: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882451.73603: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882451.73605: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882451.73607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882451.73610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882451.73612: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882451.73615: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882451.73617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882451.73691: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882451.73706: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882451.73710: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882451.73833: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882451.75767: stdout chunk (state=3): >>>ansible-tmp-1726882451.72959-15257-103406312520377=/root/.ansible/tmp/ansible-tmp-1726882451.72959-15257-103406312520377 <<< 13531 1726882451.75870: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882451.75976: stderr chunk (state=3): >>><<< 13531 1726882451.75986: stdout chunk (state=3): >>><<< 13531 1726882451.76084: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882451.72959-15257-103406312520377=/root/.ansible/tmp/ansible-tmp-1726882451.72959-15257-103406312520377 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882451.76091: variable 'ansible_module_compression' from source: unknown 13531 
1726882451.76180: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13531i_snf1g8/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 13531 1726882451.76220: variable 'ansible_facts' from source: unknown 13531 1726882451.76320: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882451.72959-15257-103406312520377/AnsiballZ_service_facts.py 13531 1726882451.76666: Sending initial data 13531 1726882451.76676: Sent initial data (160 bytes) 13531 1726882451.78573: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882451.78577: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882451.78602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882451.78610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882451.78645: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882451.78648: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882451.78662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882451.78678: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882451.78684: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882451.78689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882451.78698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882451.78716: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882451.78719: stderr chunk (state=3): 
>>>debug2: match found <<< 13531 1726882451.78739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882451.78835: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882451.78850: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882451.78856: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882451.79020: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882451.80793: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13531 1726882451.80892: stderr chunk (state=3): >>>debug1: Using server download size 261120 <<< 13531 1726882451.80904: stderr chunk (state=3): >>>debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13531 1726882451.81010: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13531i_snf1g8/tmpifvpkncq /root/.ansible/tmp/ansible-tmp-1726882451.72959-15257-103406312520377/AnsiballZ_service_facts.py <<< 13531 1726882451.81140: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13531 1726882451.82561: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882451.82697: stderr chunk (state=3): >>><<< 13531 1726882451.82701: stdout chunk (state=3): >>><<< 13531 1726882451.82716: done 
transferring module to remote 13531 1726882451.82725: _low_level_execute_command(): starting 13531 1726882451.82729: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882451.72959-15257-103406312520377/ /root/.ansible/tmp/ansible-tmp-1726882451.72959-15257-103406312520377/AnsiballZ_service_facts.py && sleep 0' 13531 1726882451.83242: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882451.83247: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882451.83269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882451.83309: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882451.83313: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882451.83315: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882451.83367: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882451.83371: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882451.83382: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882451.83493: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 
<<< 13531 1726882451.85301: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882451.85366: stderr chunk (state=3): >>><<< 13531 1726882451.85370: stdout chunk (state=3): >>><<< 13531 1726882451.85381: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882451.85384: _low_level_execute_command(): starting 13531 1726882451.85392: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882451.72959-15257-103406312520377/AnsiballZ_service_facts.py && sleep 0' 13531 1726882451.85878: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882451.85882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882451.85916: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882451.85928: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882451.85939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882451.85993: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882451.85999: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882451.86010: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882451.86126: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882453.22541: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": 
"cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": 
"emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": 
"inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-qu<<< 13531 1726882453.22554: stdout chunk (state=3): >>>it-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": 
"stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "sourc<<< 13531 1726882453.22557: stdout chunk (state=3): >>>e": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": 
"systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.servi<<< 13531 1726882453.22561: stdout chunk (state=3): >>>ce": {"name": "systemd-user-sessions.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": 
"kvm_stat.service", "state": "ina<<< 13531 1726882453.22590: stdout chunk (state=3): >>>ctive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": 
"selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "system<<< 13531 1726882453.22599: stdout chunk (state=3): >>>d"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 13531 1726882453.23907: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 13531 1726882453.23912: stdout chunk (state=3): >>><<< 13531 1726882453.23915: stderr chunk (state=3): >>><<< 13531 1726882453.24184: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": 
"nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": 
"serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": 
"systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": 
"rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", 
"state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
13531 1726882453.24782: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882451.72959-15257-103406312520377/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13531 1726882453.24881: _low_level_execute_command(): starting 13531 1726882453.24893: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882451.72959-15257-103406312520377/ > /dev/null 2>&1 && sleep 0' 13531 1726882453.26799: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882453.26803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882453.26821: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 13531 1726882453.26894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882453.26897: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882453.26899: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882453.27041: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882453.27052: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882453.27408: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882453.29099: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882453.29175: stderr chunk (state=3): >>><<< 13531 1726882453.29179: stdout chunk (state=3): >>><<< 13531 1726882453.29381: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 13531 1726882453.29386: handler run complete 13531 1726882453.29482: variable 'ansible_facts' from source: unknown 13531 1726882453.29587: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882453.30333: variable 'ansible_facts' from source: unknown 13531 1726882453.30712: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882453.31111: attempt loop complete, returning result 13531 1726882453.31130: _execute() done 13531 1726882453.31254: dumping result to json 13531 1726882453.31394: done dumping result, returning 13531 1726882453.31409: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-4fd9-519d-000000000854] 13531 1726882453.31476: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000854 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13531 1726882453.32775: no more pending results, returning what we have 13531 1726882453.32778: results queue empty 13531 1726882453.32779: checking for any_errors_fatal 13531 1726882453.32783: done checking for any_errors_fatal 13531 1726882453.32784: checking for max_fail_percentage 13531 1726882453.32786: done checking for max_fail_percentage 13531 1726882453.32786: checking to see if all hosts have failed and the running result is not ok 13531 1726882453.32787: done checking to see if all hosts have failed 13531 1726882453.32788: getting the remaining hosts for this loop 13531 1726882453.32789: done getting the remaining hosts for this loop 13531 1726882453.32792: getting the next task for host managed_node2 13531 1726882453.32799: done getting next task for host managed_node2 13531 1726882453.32802: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are 
installed 13531 1726882453.32806: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882453.32817: getting variables 13531 1726882453.32819: in VariableManager get_vars() 13531 1726882453.32875: Calling all_inventory to load vars for managed_node2 13531 1726882453.32879: Calling groups_inventory to load vars for managed_node2 13531 1726882453.32881: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882453.32893: Calling all_plugins_play to load vars for managed_node2 13531 1726882453.32896: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882453.32899: Calling groups_plugins_play to load vars for managed_node2 13531 1726882453.33482: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000854 13531 1726882453.33486: WORKER PROCESS EXITING 13531 1726882453.35333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882453.39416: done with get_vars() 13531 1726882453.39453: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:34:13 -0400 (0:00:01.716) 0:00:41.290 ****** 13531 1726882453.39569: entering _queue_task() for managed_node2/package_facts 13531 1726882453.39923: worker is 1 (out of 1 available) 13531 1726882453.39936: exiting _queue_task() for managed_node2/package_facts 13531 1726882453.39950: done queuing things up, now waiting for results queue to drain 13531 1726882453.39951: waiting for pending results... 13531 1726882453.40270: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 13531 1726882453.40452: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000855 13531 1726882453.40477: variable 'ansible_search_path' from source: unknown 13531 1726882453.40484: variable 'ansible_search_path' from source: unknown 13531 1726882453.40533: calling self._execute() 13531 1726882453.40647: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882453.40663: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882453.40680: variable 'omit' from source: magic vars 13531 1726882453.41219: variable 'ansible_distribution_major_version' from source: facts 13531 1726882453.41280: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882453.41379: variable 'omit' from source: magic vars 13531 1726882453.41462: variable 'omit' from source: magic vars 13531 1726882453.41540: variable 'omit' from source: magic vars 13531 1726882453.41643: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882453.41736: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882453.41828: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882453.41857: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882453.41878: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882453.41941: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882453.42007: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882453.42020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882453.42143: Set connection var ansible_pipelining to False 13531 1726882453.42157: Set connection var ansible_timeout to 10 13531 1726882453.42174: Set connection var ansible_shell_executable to /bin/sh 13531 1726882453.42185: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882453.42193: Set connection var ansible_connection to ssh 13531 1726882453.42198: Set connection var ansible_shell_type to sh 13531 1726882453.42229: variable 'ansible_shell_executable' from source: unknown 13531 1726882453.42241: variable 'ansible_connection' from source: unknown 13531 1726882453.42250: variable 'ansible_module_compression' from source: unknown 13531 1726882453.42260: variable 'ansible_shell_type' from source: unknown 13531 1726882453.42273: variable 'ansible_shell_executable' from source: unknown 13531 1726882453.42280: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882453.42288: variable 'ansible_pipelining' from source: unknown 13531 1726882453.42295: variable 'ansible_timeout' from source: unknown 13531 1726882453.42302: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882453.42516: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13531 1726882453.42533: variable 'omit' from source: magic vars 13531 1726882453.42543: starting attempt loop 13531 1726882453.42551: running the handler 13531 1726882453.42579: _low_level_execute_command(): starting 13531 1726882453.42592: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13531 1726882453.44289: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882453.44306: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882453.44322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882453.44341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882453.44396: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882453.44410: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882453.44424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882453.44444: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882453.44460: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882453.44478: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882453.44492: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882453.44506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882453.44521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882453.44533: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.11.158 originally 10.31.11.158 <<< 13531 1726882453.44550: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882453.44571: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882453.44649: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882453.44672: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882453.44691: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882453.44915: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882453.46578: stdout chunk (state=3): >>>/root <<< 13531 1726882453.46779: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882453.46783: stdout chunk (state=3): >>><<< 13531 1726882453.46785: stderr chunk (state=3): >>><<< 13531 1726882453.46872: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882453.46877: _low_level_execute_command(): starting 13531 1726882453.46881: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882453.46807-15324-91894917030942 `" && echo ansible-tmp-1726882453.46807-15324-91894917030942="` echo /root/.ansible/tmp/ansible-tmp-1726882453.46807-15324-91894917030942 `" ) && sleep 0' 13531 1726882453.48629: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882453.48652: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882453.48679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882453.48699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882453.48745: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882453.48769: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882453.48789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882453.48808: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882453.48822: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882453.48835: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882453.48849: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882453.48874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882453.48899: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882453.48911: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882453.48921: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882453.48933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882453.49013: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882453.49032: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882453.49046: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882453.49248: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882453.51141: stdout chunk (state=3): >>>ansible-tmp-1726882453.46807-15324-91894917030942=/root/.ansible/tmp/ansible-tmp-1726882453.46807-15324-91894917030942 <<< 13531 1726882453.51336: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882453.51340: stdout chunk (state=3): >>><<< 13531 1726882453.51342: stderr chunk (state=3): >>><<< 13531 1726882453.51571: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882453.46807-15324-91894917030942=/root/.ansible/tmp/ansible-tmp-1726882453.46807-15324-91894917030942 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882453.51574: variable 'ansible_module_compression' from source: unknown 13531 1726882453.51576: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13531i_snf1g8/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 13531 1726882453.51578: variable 'ansible_facts' from source: unknown 13531 1726882453.51730: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882453.46807-15324-91894917030942/AnsiballZ_package_facts.py 13531 1726882453.52266: Sending initial data 13531 1726882453.52278: Sent initial data (159 bytes) 13531 1726882453.56400: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882453.56577: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882453.56594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882453.56615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882453.56674: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882453.56783: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882453.56800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882453.56819: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882453.56832: 
stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882453.56844: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882453.56859: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882453.56876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882453.56897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882453.56909: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882453.56920: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882453.56934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882453.57182: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882453.57200: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882453.57217: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882453.57565: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882453.59311: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13531 1726882453.59413: stderr chunk (state=3): >>>debug1: Using server download size 261120 
debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13531 1726882453.59509: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13531i_snf1g8/tmpvxt4mipv /root/.ansible/tmp/ansible-tmp-1726882453.46807-15324-91894917030942/AnsiballZ_package_facts.py <<< 13531 1726882453.59602: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13531 1726882453.62912: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882453.63088: stderr chunk (state=3): >>><<< 13531 1726882453.63092: stdout chunk (state=3): >>><<< 13531 1726882453.63094: done transferring module to remote 13531 1726882453.63101: _low_level_execute_command(): starting 13531 1726882453.63104: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882453.46807-15324-91894917030942/ /root/.ansible/tmp/ansible-tmp-1726882453.46807-15324-91894917030942/AnsiballZ_package_facts.py && sleep 0' 13531 1726882453.64448: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882453.64459: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882453.64468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882453.64483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882453.64531: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882453.64538: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882453.64547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882453.64562: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882453.64571: stderr chunk (state=3): >>>debug2: resolve_canonicalize: 
hostname 10.31.11.158 is address <<< 13531 1726882453.64578: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882453.64585: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882453.64596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882453.64612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882453.64619: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882453.64627: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882453.64634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882453.64725: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882453.64745: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882453.64759: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882453.64893: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882453.67165: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882453.67170: stdout chunk (state=3): >>><<< 13531 1726882453.67175: stderr chunk (state=3): >>><<< 13531 1726882453.67194: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882453.67197: _low_level_execute_command(): starting 13531 1726882453.67199: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882453.46807-15324-91894917030942/AnsiballZ_package_facts.py && sleep 0' 13531 1726882453.68723: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882453.68743: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882453.68759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882453.68780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882453.68828: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882453.68979: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882453.68993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882453.69010: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882453.69020: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882453.69030: stderr chunk (state=3): >>>debug1: 
re-parsing configuration <<< 13531 1726882453.69040: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882453.69052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882453.69074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882453.69182: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882453.69199: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882453.69212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882453.69295: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882453.69313: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882453.69328: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882453.69530: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882454.17198: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", 
"version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": 
"1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", 
"version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_<<< 13531 1726882454.17217: stdout chunk (state=3): >>>64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": 
"0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": 
[{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [<<< 13531 1726882454.17225: stdout chunk (state=3): >>>{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", 
"version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba<<< 13531 1726882454.17230: stdout chunk (state=3): >>>", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": 
"x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", 
"release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "py<<< 13531 1726882454.17237: stdout chunk (state=3): >>>thon3-systemd": 
[{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epo<<< 13531 1726882454.17326: stdout chunk (state=3): >>>ch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": 
"rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", 
"release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rp<<< 13531 1726882454.17339: stdout chunk (state=3): >>>m"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": 
[{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1"<<< 13531 1726882454.17342: stdout chunk (state=3): >>>, "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", 
"version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": 
"4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], 
"perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "pe<<< 13531 1726882454.17347: stdout chunk (state=3): >>>rl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": 
"perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": 
"rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "<<< 13531 1726882454.17379: stdout chunk (state=3): >>>8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", 
"release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], 
"perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "relea<<< 13531 1726882454.17386: stdout chunk (state=3): >>>se": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", 
"version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": 
"1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 13531 1726882454.18909: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
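For readers working with this `-vvvv` output: the `ansible_facts.packages` structure the task returns maps each package name to a *list* of install records (a list because multiple arch or version installs of one name are possible), each with `name`, `version`, `release`, `epoch` (an integer or `null`), `arch`, and `source` keys. A minimal sketch of flattening that structure into readable NEVRA strings — the `to_nevra` helper is our own, and the two-entry `sample` dict is hand-copied from entries in the log above:

```python
# Sketch only: `sample` mimics the ansible_facts.packages shape shown in the
# log; in a playbook you would read the real fact instead of this literal.
sample = {
    "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9",
              "epoch": None, "arch": "x86_64", "source": "rpm"}],
    "tar": [{"name": "tar", "version": "1.34", "release": "7.el9",
             "epoch": 2, "arch": "x86_64", "source": "rpm"}],
}

def to_nevra(pkg):
    """Render one package record as name-[epoch:]version-release.arch.

    Per rpm convention, a null or 0 epoch is omitted from the string.
    """
    epoch = "" if pkg["epoch"] in (None, 0) else f"{pkg['epoch']}:"
    return f"{pkg['name']}-{epoch}{pkg['version']}-{pkg['release']}.{pkg['arch']}"

# Each name maps to a list, so flatten across all records.
nevras = [to_nevra(p) for records in sample.values() for p in records]
print(nevras)  # ['bash-5.1.8-9.el9.x86_64', 'tar-2:1.34-7.el9.x86_64']
```

The same traversal applies to the full fact: every `"epoch": null` entry in the log renders without an epoch prefix, while entries like `tar` (`"epoch": 2`) carry it.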
<<< 13531 1726882454.18912: stdout chunk (state=3): >>><<< 13531 1726882454.18915: stderr chunk (state=3): >>><<< 13531 1726882454.19182: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": 
[{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": 
[{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", 
"release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": 
"0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": 
"libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": 
"cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", 
"version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", 
"release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", 
"version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": 
[{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", 
"release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": 
"elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": 
"2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", 
"release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": 
"liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": 
"146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": 
[{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": 
"perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", 
"version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, 
"arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": 
"python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": 
[{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
13531 1726882454.21848: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882453.46807-15324-91894917030942/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13531 1726882454.21869: _low_level_execute_command(): starting 13531 1726882454.21872: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882453.46807-15324-91894917030942/ > /dev/null 2>&1 && sleep 0' 13531 1726882454.22510: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882454.22519: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882454.22529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882454.22543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882454.22587: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882454.22594: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882454.22613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882454.22627: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882454.22635: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address 
<<< 13531 1726882454.22641: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882454.22649: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882454.22659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882454.22679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882454.22686: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882454.22693: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882454.22702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882454.22792: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882454.22801: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882454.22804: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882454.22945: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882454.24811: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882454.24901: stderr chunk (state=3): >>><<< 13531 1726882454.24904: stdout chunk (state=3): >>><<< 13531 1726882454.24922: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882454.24928: handler run complete 13531 1726882454.25882: variable 'ansible_facts' from source: unknown 13531 1726882454.26390: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882454.28492: variable 'ansible_facts' from source: unknown 13531 1726882454.28965: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882454.29707: attempt loop complete, returning result 13531 1726882454.29719: _execute() done 13531 1726882454.29722: dumping result to json 13531 1726882454.29958: done dumping result, returning 13531 1726882454.29968: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-4fd9-519d-000000000855] 13531 1726882454.29974: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000855 13531 1726882454.32387: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000855 13531 1726882454.32390: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13531 1726882454.32597: no more pending results, returning what we have 13531 1726882454.32599: results queue empty 13531 1726882454.32600: checking for 
any_errors_fatal 13531 1726882454.32608: done checking for any_errors_fatal 13531 1726882454.32609: checking for max_fail_percentage 13531 1726882454.32611: done checking for max_fail_percentage 13531 1726882454.32612: checking to see if all hosts have failed and the running result is not ok 13531 1726882454.32613: done checking to see if all hosts have failed 13531 1726882454.32613: getting the remaining hosts for this loop 13531 1726882454.32615: done getting the remaining hosts for this loop 13531 1726882454.32619: getting the next task for host managed_node2 13531 1726882454.32625: done getting next task for host managed_node2 13531 1726882454.32629: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 13531 1726882454.32632: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882454.32642: getting variables 13531 1726882454.32644: in VariableManager get_vars() 13531 1726882454.32699: Calling all_inventory to load vars for managed_node2 13531 1726882454.32703: Calling groups_inventory to load vars for managed_node2 13531 1726882454.32705: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882454.32715: Calling all_plugins_play to load vars for managed_node2 13531 1726882454.32718: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882454.32721: Calling groups_plugins_play to load vars for managed_node2 13531 1726882454.34407: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882454.37569: done with get_vars() 13531 1726882454.37612: done getting variables 13531 1726882454.37673: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:34:14 -0400 (0:00:00.981) 0:00:42.272 ****** 13531 1726882454.37708: entering _queue_task() for managed_node2/debug 13531 1726882454.38028: worker is 1 (out of 1 available) 13531 1726882454.38040: exiting _queue_task() for managed_node2/debug 13531 1726882454.38053: done queuing things up, now waiting for results queue to drain 13531 1726882454.38054: waiting for pending results... 
13531 1726882454.38345: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 13531 1726882454.38473: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000011c 13531 1726882454.38485: variable 'ansible_search_path' from source: unknown 13531 1726882454.38489: variable 'ansible_search_path' from source: unknown 13531 1726882454.38531: calling self._execute() 13531 1726882454.38634: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882454.38638: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882454.38648: variable 'omit' from source: magic vars 13531 1726882454.39033: variable 'ansible_distribution_major_version' from source: facts 13531 1726882454.39051: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882454.39059: variable 'omit' from source: magic vars 13531 1726882454.39117: variable 'omit' from source: magic vars 13531 1726882454.39212: variable 'network_provider' from source: set_fact 13531 1726882454.39229: variable 'omit' from source: magic vars 13531 1726882454.39276: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882454.39310: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882454.39328: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882454.39345: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882454.39358: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882454.39390: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882454.39393: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 
1726882454.39396: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882454.39503: Set connection var ansible_pipelining to False 13531 1726882454.39508: Set connection var ansible_timeout to 10 13531 1726882454.39513: Set connection var ansible_shell_executable to /bin/sh 13531 1726882454.39519: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882454.39521: Set connection var ansible_connection to ssh 13531 1726882454.39523: Set connection var ansible_shell_type to sh 13531 1726882454.39550: variable 'ansible_shell_executable' from source: unknown 13531 1726882454.39556: variable 'ansible_connection' from source: unknown 13531 1726882454.39559: variable 'ansible_module_compression' from source: unknown 13531 1726882454.39562: variable 'ansible_shell_type' from source: unknown 13531 1726882454.39567: variable 'ansible_shell_executable' from source: unknown 13531 1726882454.39569: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882454.39571: variable 'ansible_pipelining' from source: unknown 13531 1726882454.39573: variable 'ansible_timeout' from source: unknown 13531 1726882454.39575: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882454.39719: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882454.39730: variable 'omit' from source: magic vars 13531 1726882454.39736: starting attempt loop 13531 1726882454.39739: running the handler 13531 1726882454.39788: handler run complete 13531 1726882454.39805: attempt loop complete, returning result 13531 1726882454.39808: _execute() done 13531 1726882454.39811: dumping result to json 13531 1726882454.39813: done dumping result, returning 
13531 1726882454.39821: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-4fd9-519d-00000000011c] 13531 1726882454.39828: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000011c 13531 1726882454.39925: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000011c 13531 1726882454.39929: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 13531 1726882454.40002: no more pending results, returning what we have 13531 1726882454.40006: results queue empty 13531 1726882454.40007: checking for any_errors_fatal 13531 1726882454.40016: done checking for any_errors_fatal 13531 1726882454.40016: checking for max_fail_percentage 13531 1726882454.40018: done checking for max_fail_percentage 13531 1726882454.40019: checking to see if all hosts have failed and the running result is not ok 13531 1726882454.40020: done checking to see if all hosts have failed 13531 1726882454.40021: getting the remaining hosts for this loop 13531 1726882454.40022: done getting the remaining hosts for this loop 13531 1726882454.40026: getting the next task for host managed_node2 13531 1726882454.40032: done getting next task for host managed_node2 13531 1726882454.40036: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13531 1726882454.40040: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 13531 1726882454.40052: getting variables 13531 1726882454.40055: in VariableManager get_vars() 13531 1726882454.40114: Calling all_inventory to load vars for managed_node2 13531 1726882454.40117: Calling groups_inventory to load vars for managed_node2 13531 1726882454.40120: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882454.40131: Calling all_plugins_play to load vars for managed_node2 13531 1726882454.40134: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882454.40136: Calling groups_plugins_play to load vars for managed_node2 13531 1726882454.43076: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882454.47333: done with get_vars() 13531 1726882454.47375: done getting variables 13531 1726882454.47440: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:34:14 -0400 (0:00:00.097) 0:00:42.369 ****** 13531 1726882454.47480: entering _queue_task() for managed_node2/fail 13531 1726882454.47820: worker is 1 (out of 1 available) 13531 1726882454.47832: exiting _queue_task() for managed_node2/fail 13531 1726882454.47845: done queuing things up, now waiting for results queue to drain 13531 1726882454.47847: waiting for pending results... 
13531 1726882454.48881: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13531 1726882454.49159: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000011d 13531 1726882454.49276: variable 'ansible_search_path' from source: unknown 13531 1726882454.49287: variable 'ansible_search_path' from source: unknown 13531 1726882454.49657: calling self._execute() 13531 1726882454.49780: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882454.49785: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882454.49795: variable 'omit' from source: magic vars 13531 1726882454.50183: variable 'ansible_distribution_major_version' from source: facts 13531 1726882454.50196: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882454.50321: variable 'network_state' from source: role '' defaults 13531 1726882454.50330: Evaluated conditional (network_state != {}): False 13531 1726882454.50334: when evaluation is False, skipping this task 13531 1726882454.50337: _execute() done 13531 1726882454.50340: dumping result to json 13531 1726882454.50343: done dumping result, returning 13531 1726882454.50349: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-4fd9-519d-00000000011d] 13531 1726882454.50359: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000011d 13531 1726882454.50453: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000011d 13531 1726882454.50459: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13531 1726882454.50510: no more pending results, 
returning what we have 13531 1726882454.50514: results queue empty 13531 1726882454.50515: checking for any_errors_fatal 13531 1726882454.50524: done checking for any_errors_fatal 13531 1726882454.50524: checking for max_fail_percentage 13531 1726882454.50526: done checking for max_fail_percentage 13531 1726882454.50527: checking to see if all hosts have failed and the running result is not ok 13531 1726882454.50528: done checking to see if all hosts have failed 13531 1726882454.50529: getting the remaining hosts for this loop 13531 1726882454.50531: done getting the remaining hosts for this loop 13531 1726882454.50534: getting the next task for host managed_node2 13531 1726882454.50541: done getting next task for host managed_node2 13531 1726882454.50545: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13531 1726882454.50549: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882454.50573: getting variables 13531 1726882454.50575: in VariableManager get_vars() 13531 1726882454.50633: Calling all_inventory to load vars for managed_node2 13531 1726882454.50636: Calling groups_inventory to load vars for managed_node2 13531 1726882454.50639: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882454.50652: Calling all_plugins_play to load vars for managed_node2 13531 1726882454.50655: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882454.50658: Calling groups_plugins_play to load vars for managed_node2 13531 1726882454.52541: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882454.54240: done with get_vars() 13531 1726882454.54280: done getting variables 13531 1726882454.54344: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:34:14 -0400 (0:00:00.069) 0:00:42.439 ****** 13531 1726882454.54386: entering _queue_task() for managed_node2/fail 13531 1726882454.54716: worker is 1 (out of 1 available) 13531 1726882454.54728: exiting _queue_task() for managed_node2/fail 13531 1726882454.54741: done queuing things up, now waiting for results queue to drain 13531 1726882454.54742: waiting for pending results... 
13531 1726882454.55037: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13531 1726882454.55192: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000011e 13531 1726882454.55211: variable 'ansible_search_path' from source: unknown 13531 1726882454.55217: variable 'ansible_search_path' from source: unknown 13531 1726882454.55259: calling self._execute() 13531 1726882454.55367: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882454.55380: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882454.55394: variable 'omit' from source: magic vars 13531 1726882454.55772: variable 'ansible_distribution_major_version' from source: facts 13531 1726882454.55790: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882454.55915: variable 'network_state' from source: role '' defaults 13531 1726882454.55930: Evaluated conditional (network_state != {}): False 13531 1726882454.55938: when evaluation is False, skipping this task 13531 1726882454.55949: _execute() done 13531 1726882454.55957: dumping result to json 13531 1726882454.55965: done dumping result, returning 13531 1726882454.55977: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-4fd9-519d-00000000011e] 13531 1726882454.55989: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000011e skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13531 1726882454.56135: no more pending results, returning what we have 13531 1726882454.56140: results queue empty 13531 1726882454.56141: checking for any_errors_fatal 13531 1726882454.56148: done checking for any_errors_fatal 
13531 1726882454.56149: checking for max_fail_percentage 13531 1726882454.56151: done checking for max_fail_percentage 13531 1726882454.56152: checking to see if all hosts have failed and the running result is not ok 13531 1726882454.56153: done checking to see if all hosts have failed 13531 1726882454.56154: getting the remaining hosts for this loop 13531 1726882454.56155: done getting the remaining hosts for this loop 13531 1726882454.56158: getting the next task for host managed_node2 13531 1726882454.56167: done getting next task for host managed_node2 13531 1726882454.56173: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13531 1726882454.56177: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882454.56199: getting variables 13531 1726882454.56201: in VariableManager get_vars() 13531 1726882454.56258: Calling all_inventory to load vars for managed_node2 13531 1726882454.56261: Calling groups_inventory to load vars for managed_node2 13531 1726882454.56265: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882454.56279: Calling all_plugins_play to load vars for managed_node2 13531 1726882454.56282: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882454.56285: Calling groups_plugins_play to load vars for managed_node2 13531 1726882454.57288: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000011e 13531 1726882454.57292: WORKER PROCESS EXITING 13531 1726882454.58047: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882454.61358: done with get_vars() 13531 1726882454.61401: done getting variables 13531 1726882454.61468: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:34:14 -0400 (0:00:00.071) 0:00:42.510 ****** 13531 1726882454.61504: entering _queue_task() for managed_node2/fail 13531 1726882454.61829: worker is 1 (out of 1 available) 13531 1726882454.61842: exiting _queue_task() for managed_node2/fail 13531 1726882454.61855: done queuing things up, now waiting for results queue to drain 13531 1726882454.61856: waiting for pending results... 
13531 1726882454.63224: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13531 1726882454.63631: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000011f 13531 1726882454.63651: variable 'ansible_search_path' from source: unknown 13531 1726882454.63665: variable 'ansible_search_path' from source: unknown 13531 1726882454.63715: calling self._execute() 13531 1726882454.63900: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882454.64025: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882454.64038: variable 'omit' from source: magic vars 13531 1726882454.64910: variable 'ansible_distribution_major_version' from source: facts 13531 1726882454.64928: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882454.65113: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13531 1726882454.67933: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13531 1726882454.67999: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13531 1726882454.68041: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13531 1726882454.68146: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13531 1726882454.68258: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13531 1726882454.68419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882454.68480: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882454.68650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882454.68706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882454.68727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882454.68973: variable 'ansible_distribution_major_version' from source: facts 13531 1726882454.69007: Evaluated conditional (ansible_distribution_major_version | int > 9): False 13531 1726882454.69104: when evaluation is False, skipping this task 13531 1726882454.69112: _execute() done 13531 1726882454.69119: dumping result to json 13531 1726882454.69127: done dumping result, returning 13531 1726882454.69139: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-4fd9-519d-00000000011f] 13531 1726882454.69150: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000011f skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 13531 1726882454.69303: no more pending results, returning what we have 13531 1726882454.69308: results queue empty 13531 1726882454.69310: checking for any_errors_fatal 13531 1726882454.69316: done checking for any_errors_fatal 13531 
1726882454.69317: checking for max_fail_percentage 13531 1726882454.69319: done checking for max_fail_percentage 13531 1726882454.69320: checking to see if all hosts have failed and the running result is not ok 13531 1726882454.69321: done checking to see if all hosts have failed 13531 1726882454.69322: getting the remaining hosts for this loop 13531 1726882454.69323: done getting the remaining hosts for this loop 13531 1726882454.69326: getting the next task for host managed_node2 13531 1726882454.69333: done getting next task for host managed_node2 13531 1726882454.69337: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13531 1726882454.69340: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882454.69361: getting variables 13531 1726882454.69365: in VariableManager get_vars() 13531 1726882454.69425: Calling all_inventory to load vars for managed_node2 13531 1726882454.69428: Calling groups_inventory to load vars for managed_node2 13531 1726882454.69431: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882454.69443: Calling all_plugins_play to load vars for managed_node2 13531 1726882454.69447: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882454.69451: Calling groups_plugins_play to load vars for managed_node2 13531 1726882454.70894: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000011f 13531 1726882454.70898: WORKER PROCESS EXITING 13531 1726882454.72784: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882454.76577: done with get_vars() 13531 1726882454.76617: done getting variables 13531 1726882454.76684: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:34:14 -0400 (0:00:00.152) 0:00:42.662 ****** 13531 1726882454.76719: entering _queue_task() for managed_node2/dnf 13531 1726882454.77060: worker is 1 (out of 1 available) 13531 1726882454.78902: exiting _queue_task() for managed_node2/dnf 13531 1726882454.78914: done queuing things up, now waiting for results queue to drain 13531 1726882454.78915: waiting for pending results... 
13531 1726882454.78939: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13531 1726882454.79393: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000120 13531 1726882454.79416: variable 'ansible_search_path' from source: unknown 13531 1726882454.79424: variable 'ansible_search_path' from source: unknown 13531 1726882454.79477: calling self._execute() 13531 1726882454.79760: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882454.79775: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882454.79789: variable 'omit' from source: magic vars 13531 1726882454.80699: variable 'ansible_distribution_major_version' from source: facts 13531 1726882454.80720: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882454.81145: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13531 1726882454.86575: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13531 1726882454.86721: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13531 1726882454.86847: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13531 1726882454.86938: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13531 1726882454.87005: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13531 1726882454.87153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882454.87376: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882454.87416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882454.87462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882454.87636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882454.87779: variable 'ansible_distribution' from source: facts 13531 1726882454.87848: variable 'ansible_distribution_major_version' from source: facts 13531 1726882454.87882: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 13531 1726882454.89041: variable '__network_wireless_connections_defined' from source: role '' defaults 13531 1726882454.89317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882454.89592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882454.89700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882454.89746: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882454.89798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882454.89899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882454.89928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882454.89959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882454.90011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882454.90030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882454.90080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882454.90115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 
1726882454.90144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882454.90191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882454.90217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882454.90397: variable 'network_connections' from source: task vars 13531 1726882454.90420: variable 'port1_profile' from source: play vars 13531 1726882454.90501: variable 'port1_profile' from source: play vars 13531 1726882454.90519: variable 'port2_profile' from source: play vars 13531 1726882454.90594: variable 'port2_profile' from source: play vars 13531 1726882454.90682: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13531 1726882454.90883: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13531 1726882454.90928: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13531 1726882454.90973: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13531 1726882454.91010: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13531 1726882454.91062: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13531 1726882454.91098: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13531 1726882454.91139: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882454.91186: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13531 1726882454.91252: variable '__network_team_connections_defined' from source: role '' defaults 13531 1726882454.91971: variable 'network_connections' from source: task vars 13531 1726882454.91984: variable 'port1_profile' from source: play vars 13531 1726882454.92176: variable 'port1_profile' from source: play vars 13531 1726882454.92191: variable 'port2_profile' from source: play vars 13531 1726882454.92256: variable 'port2_profile' from source: play vars 13531 1726882454.92402: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13531 1726882454.92411: when evaluation is False, skipping this task 13531 1726882454.92419: _execute() done 13531 1726882454.92425: dumping result to json 13531 1726882454.92432: done dumping result, returning 13531 1726882454.92444: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-4fd9-519d-000000000120] 13531 1726882454.92456: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000120 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13531 1726882454.92724: no more pending results, returning what we have 13531 
1726882454.92729: results queue empty 13531 1726882454.92730: checking for any_errors_fatal 13531 1726882454.92736: done checking for any_errors_fatal 13531 1726882454.92737: checking for max_fail_percentage 13531 1726882454.92739: done checking for max_fail_percentage 13531 1726882454.92740: checking to see if all hosts have failed and the running result is not ok 13531 1726882454.92741: done checking to see if all hosts have failed 13531 1726882454.92742: getting the remaining hosts for this loop 13531 1726882454.92744: done getting the remaining hosts for this loop 13531 1726882454.92747: getting the next task for host managed_node2 13531 1726882454.92755: done getting next task for host managed_node2 13531 1726882454.92760: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13531 1726882454.92766: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882454.92787: getting variables 13531 1726882454.92789: in VariableManager get_vars() 13531 1726882454.92849: Calling all_inventory to load vars for managed_node2 13531 1726882454.92853: Calling groups_inventory to load vars for managed_node2 13531 1726882454.92855: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882454.92869: Calling all_plugins_play to load vars for managed_node2 13531 1726882454.92873: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882454.92877: Calling groups_plugins_play to load vars for managed_node2 13531 1726882454.93915: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000120 13531 1726882454.93919: WORKER PROCESS EXITING 13531 1726882454.94810: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882454.97545: done with get_vars() 13531 1726882454.97585: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 13531 1726882454.97676: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:34:14 -0400 (0:00:00.209) 0:00:42.872 ****** 13531 1726882454.97709: entering _queue_task() for managed_node2/yum 13531 1726882454.98087: worker is 1 (out of 1 available) 13531 1726882454.98100: exiting _queue_task() for managed_node2/yum 13531 1726882454.98113: done queuing things up, now 
waiting for results queue to drain 13531 1726882454.98115: waiting for pending results... 13531 1726882454.98434: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13531 1726882454.98577: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000121 13531 1726882454.98594: variable 'ansible_search_path' from source: unknown 13531 1726882454.98598: variable 'ansible_search_path' from source: unknown 13531 1726882454.98638: calling self._execute() 13531 1726882454.98743: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882454.98747: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882454.98757: variable 'omit' from source: magic vars 13531 1726882454.99170: variable 'ansible_distribution_major_version' from source: facts 13531 1726882454.99182: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882454.99433: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13531 1726882455.03371: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13531 1726882455.03527: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13531 1726882455.03569: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13531 1726882455.03639: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13531 1726882455.03673: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13531 1726882455.03768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882455.03822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882455.03852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882455.03902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882455.03917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882455.04029: variable 'ansible_distribution_major_version' from source: facts 13531 1726882455.04044: Evaluated conditional (ansible_distribution_major_version | int < 8): False 13531 1726882455.04047: when evaluation is False, skipping this task 13531 1726882455.04050: _execute() done 13531 1726882455.04052: dumping result to json 13531 1726882455.04060: done dumping result, returning 13531 1726882455.04070: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-4fd9-519d-000000000121] 13531 1726882455.04077: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000121 13531 1726882455.04180: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000121 13531 1726882455.04183: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": 
"ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 13531 1726882455.04231: no more pending results, returning what we have 13531 1726882455.04235: results queue empty 13531 1726882455.04236: checking for any_errors_fatal 13531 1726882455.04242: done checking for any_errors_fatal 13531 1726882455.04243: checking for max_fail_percentage 13531 1726882455.04244: done checking for max_fail_percentage 13531 1726882455.04245: checking to see if all hosts have failed and the running result is not ok 13531 1726882455.04246: done checking to see if all hosts have failed 13531 1726882455.04247: getting the remaining hosts for this loop 13531 1726882455.04248: done getting the remaining hosts for this loop 13531 1726882455.04251: getting the next task for host managed_node2 13531 1726882455.04257: done getting next task for host managed_node2 13531 1726882455.04261: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13531 1726882455.04266: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882455.04287: getting variables 13531 1726882455.04288: in VariableManager get_vars() 13531 1726882455.04343: Calling all_inventory to load vars for managed_node2 13531 1726882455.04346: Calling groups_inventory to load vars for managed_node2 13531 1726882455.04348: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882455.04358: Calling all_plugins_play to load vars for managed_node2 13531 1726882455.04361: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882455.04365: Calling groups_plugins_play to load vars for managed_node2 13531 1726882455.07201: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882455.10150: done with get_vars() 13531 1726882455.10187: done getting variables 13531 1726882455.10260: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:34:15 -0400 (0:00:00.125) 0:00:42.998 ****** 13531 1726882455.10299: entering _queue_task() for managed_node2/fail 13531 1726882455.10672: worker is 1 (out of 1 available) 13531 1726882455.10684: exiting _queue_task() for managed_node2/fail 13531 1726882455.10696: done queuing things up, now waiting for results queue to drain 13531 1726882455.10697: waiting for pending results... 
13531 1726882455.11010: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13531 1726882455.11135: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000122 13531 1726882455.11152: variable 'ansible_search_path' from source: unknown 13531 1726882455.11155: variable 'ansible_search_path' from source: unknown 13531 1726882455.11199: calling self._execute() 13531 1726882455.11566: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882455.11574: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882455.11588: variable 'omit' from source: magic vars 13531 1726882455.11956: variable 'ansible_distribution_major_version' from source: facts 13531 1726882455.12077: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882455.12313: variable '__network_wireless_connections_defined' from source: role '' defaults 13531 1726882455.12730: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13531 1726882455.16081: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13531 1726882455.16145: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13531 1726882455.16189: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13531 1726882455.16223: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13531 1726882455.16248: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13531 1726882455.16334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 13531 1726882455.16366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882455.16395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882455.16437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882455.16452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882455.16506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882455.16529: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882455.16554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882455.16601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882455.16616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882455.16655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882455.16683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882455.16712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882455.16751: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882455.16770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882455.16956: variable 'network_connections' from source: task vars 13531 1726882455.16972: variable 'port1_profile' from source: play vars 13531 1726882455.17043: variable 'port1_profile' from source: play vars 13531 1726882455.17054: variable 'port2_profile' from source: play vars 13531 1726882455.17121: variable 'port2_profile' from source: play vars 13531 1726882455.17200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13531 1726882455.17395: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13531 1726882455.17431: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13531 
1726882455.17468: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13531 1726882455.17498: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13531 1726882455.17540: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13531 1726882455.17564: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13531 1726882455.17591: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882455.17614: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13531 1726882455.17666: variable '__network_team_connections_defined' from source: role '' defaults 13531 1726882455.17903: variable 'network_connections' from source: task vars 13531 1726882455.17906: variable 'port1_profile' from source: play vars 13531 1726882455.17967: variable 'port1_profile' from source: play vars 13531 1726882455.17973: variable 'port2_profile' from source: play vars 13531 1726882455.18034: variable 'port2_profile' from source: play vars 13531 1726882455.18060: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13531 1726882455.18065: when evaluation is False, skipping this task 13531 1726882455.18067: _execute() done 13531 1726882455.18070: dumping result to json 13531 1726882455.18074: done dumping result, returning 13531 1726882455.18083: done running TaskExecutor() for managed_node2/TASK: 
fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-4fd9-519d-000000000122] 13531 1726882455.18094: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000122 13531 1726882455.18190: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000122 13531 1726882455.18193: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13531 1726882455.18262: no more pending results, returning what we have 13531 1726882455.18267: results queue empty 13531 1726882455.18268: checking for any_errors_fatal 13531 1726882455.18275: done checking for any_errors_fatal 13531 1726882455.18276: checking for max_fail_percentage 13531 1726882455.18278: done checking for max_fail_percentage 13531 1726882455.18279: checking to see if all hosts have failed and the running result is not ok 13531 1726882455.18279: done checking to see if all hosts have failed 13531 1726882455.18280: getting the remaining hosts for this loop 13531 1726882455.18281: done getting the remaining hosts for this loop 13531 1726882455.18285: getting the next task for host managed_node2 13531 1726882455.18290: done getting next task for host managed_node2 13531 1726882455.18294: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 13531 1726882455.18297: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 13531 1726882455.18317: getting variables 13531 1726882455.18318: in VariableManager get_vars() 13531 1726882455.18371: Calling all_inventory to load vars for managed_node2 13531 1726882455.18374: Calling groups_inventory to load vars for managed_node2 13531 1726882455.18376: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882455.18386: Calling all_plugins_play to load vars for managed_node2 13531 1726882455.18388: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882455.18391: Calling groups_plugins_play to load vars for managed_node2 13531 1726882455.19933: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882455.22017: done with get_vars() 13531 1726882455.22049: done getting variables 13531 1726882455.22125: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:34:15 -0400 (0:00:00.118) 0:00:43.116 ****** 13531 1726882455.22166: entering _queue_task() for managed_node2/package 13531 1726882455.22531: worker is 1 (out of 1 available) 13531 1726882455.22544: exiting _queue_task() for managed_node2/package 13531 1726882455.22560: done queuing things up, now waiting for results queue to drain 13531 1726882455.22562: waiting for pending results... 
13531 1726882455.22884: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 13531 1726882455.23018: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000123 13531 1726882455.23031: variable 'ansible_search_path' from source: unknown 13531 1726882455.23035: variable 'ansible_search_path' from source: unknown 13531 1726882455.23078: calling self._execute() 13531 1726882455.23180: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882455.23185: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882455.23200: variable 'omit' from source: magic vars 13531 1726882455.23587: variable 'ansible_distribution_major_version' from source: facts 13531 1726882455.23599: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882455.23803: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13531 1726882455.24086: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13531 1726882455.24129: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13531 1726882455.24161: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13531 1726882455.24246: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13531 1726882455.24368: variable 'network_packages' from source: role '' defaults 13531 1726882455.24484: variable '__network_provider_setup' from source: role '' defaults 13531 1726882455.24494: variable '__network_service_name_default_nm' from source: role '' defaults 13531 1726882455.24568: variable '__network_service_name_default_nm' from source: role '' defaults 13531 1726882455.24576: variable '__network_packages_default_nm' from source: role '' defaults 13531 1726882455.24642: variable 
'__network_packages_default_nm' from source: role '' defaults 13531 1726882455.24842: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13531 1726882455.27142: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13531 1726882455.27207: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13531 1726882455.27254: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13531 1726882455.27290: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13531 1726882455.27317: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13531 1726882455.27408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882455.27440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882455.27476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882455.27518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882455.27532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 
1726882455.27591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882455.27614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882455.27638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882455.27688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882455.27701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882455.27952: variable '__network_packages_default_gobject_packages' from source: role '' defaults 13531 1726882455.28079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882455.28112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882455.28136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882455.28179: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13531 1726882455.28193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13531 1726882455.28295: variable 'ansible_python' from source: facts
13531 1726882455.28329: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
13531 1726882455.28417: variable '__network_wpa_supplicant_required' from source: role '' defaults
13531 1726882455.28505: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
13531 1726882455.28666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13531 1726882455.28689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13531 1726882455.28713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13531 1726882455.28763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13531 1726882455.28781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13531 1726882455.28825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13531 1726882455.28852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13531 1726882455.28887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13531 1726882455.28926: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13531 1726882455.28941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13531 1726882455.29103: variable 'network_connections' from source: task vars
13531 1726882455.29110: variable 'port1_profile' from source: play vars
13531 1726882455.29218: variable 'port1_profile' from source: play vars
13531 1726882455.29228: variable 'port2_profile' from source: play vars
13531 1726882455.29330: variable 'port2_profile' from source: play vars
13531 1726882455.29400: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
13531 1726882455.29428: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
13531 1726882455.29455: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
13531 1726882455.29488: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
13531 1726882455.29541: variable '__network_wireless_connections_defined' from source: role '' defaults
13531 1726882455.29841: variable 'network_connections' from source: task vars
13531 1726882455.29847: variable 'port1_profile' from source: play vars
13531 1726882455.29956: variable 'port1_profile' from source: play vars
13531 1726882455.29971: variable 'port2_profile' from source: play vars
13531 1726882455.30066: variable 'port2_profile' from source: play vars
13531 1726882455.30095: variable '__network_packages_default_wireless' from source: role '' defaults
13531 1726882455.30182: variable '__network_wireless_connections_defined' from source: role '' defaults
13531 1726882455.30512: variable 'network_connections' from source: task vars
13531 1726882455.30515: variable 'port1_profile' from source: play vars
13531 1726882455.30584: variable 'port1_profile' from source: play vars
13531 1726882455.30601: variable 'port2_profile' from source: play vars
13531 1726882455.30667: variable 'port2_profile' from source: play vars
13531 1726882455.30689: variable '__network_packages_default_team' from source: role '' defaults
13531 1726882455.30777: variable '__network_team_connections_defined' from source: role '' defaults
13531 1726882455.31120: variable 'network_connections' from source: task vars
13531 1726882455.31123: variable 'port1_profile' from source: play vars
13531 1726882455.31198: variable 'port1_profile' from source: play vars
13531 1726882455.31205: variable 'port2_profile' from source: play vars
13531 1726882455.31278: variable 'port2_profile' from source: play vars
13531 1726882455.31330: variable '__network_service_name_default_initscripts' from source: role '' defaults
13531 1726882455.31399: variable '__network_service_name_default_initscripts' from source: role '' defaults
13531 1726882455.31405: variable '__network_packages_default_initscripts' from source: role '' defaults
13531 1726882455.31474: variable '__network_packages_default_initscripts' from source: role '' defaults
13531 1726882455.32068: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults
13531 1726882455.32221: variable 'network_connections' from source: task vars
13531 1726882455.32230: variable 'port1_profile' from source: play vars
13531 1726882455.32291: variable 'port1_profile' from source: play vars
13531 1726882455.32298: variable 'port2_profile' from source: play vars
13531 1726882455.32357: variable 'port2_profile' from source: play vars
13531 1726882455.32373: variable 'ansible_distribution' from source: facts
13531 1726882455.32377: variable '__network_rh_distros' from source: role '' defaults
13531 1726882455.32382: variable 'ansible_distribution_major_version' from source: facts
13531 1726882455.32395: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults
13531 1726882455.32545: variable 'ansible_distribution' from source: facts
13531 1726882455.32555: variable '__network_rh_distros' from source: role '' defaults
13531 1726882455.32562: variable 'ansible_distribution_major_version' from source: facts
13531 1726882455.32579: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
13531 1726882455.32731: variable 'ansible_distribution' from source: facts
13531 1726882455.32734: variable '__network_rh_distros' from source: role '' defaults
13531 1726882455.32739: variable 'ansible_distribution_major_version' from source: facts
13531 1726882455.32783: variable 'network_provider' from source: set_fact
13531 1726882455.32799: variable 'ansible_facts' from source: unknown
13531 1726882455.33811: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False
13531 1726882455.33814: when evaluation is False, skipping this task
13531 1726882455.33817: _execute() done
13531 1726882455.33820: dumping result to json
13531 1726882455.33822: done dumping result, returning
13531 1726882455.33830: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-4fd9-519d-000000000123]
13531 1726882455.33835: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000123
13531 1726882455.33940: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000123
13531 1726882455.33943: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "not network_packages is subset(ansible_facts.packages.keys())",
    "skip_reason": "Conditional result was False"
}
13531 1726882455.34010: no more pending results, returning what we have
13531 1726882455.34014: results queue empty
13531 1726882455.34015: checking for any_errors_fatal
13531 1726882455.34022: done checking for any_errors_fatal
13531 1726882455.34023: checking for max_fail_percentage
13531 1726882455.34025: done checking for max_fail_percentage
13531 1726882455.34025: checking to see if all hosts have failed and the running result is not ok
13531 1726882455.34026: done checking to see if all hosts have failed
13531 1726882455.34027: getting the remaining hosts for this loop
13531 1726882455.34028: done getting the remaining hosts for this loop
13531 1726882455.34031: getting the next task for host managed_node2
13531 1726882455.34037: done getting next task for host managed_node2
13531 1726882455.34041: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
13531 1726882455.34044: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13531 1726882455.34071: getting variables
13531 1726882455.34072: in VariableManager get_vars()
13531 1726882455.34121: Calling all_inventory to load vars for managed_node2
13531 1726882455.34123: Calling groups_inventory to load vars for managed_node2
13531 1726882455.34126: Calling all_plugins_inventory to load vars for managed_node2
13531 1726882455.34135: Calling all_plugins_play to load vars for managed_node2
13531 1726882455.34138: Calling groups_plugins_inventory to load vars for managed_node2
13531 1726882455.34140: Calling groups_plugins_play to load vars for managed_node2
13531 1726882455.35573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13531 1726882455.37492: done with get_vars()
13531 1726882455.37515: done getting variables
13531 1726882455.37577: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Friday 20 September 2024 21:34:15 -0400 (0:00:00.154) 0:00:43.271 ******
13531 1726882455.37617: entering _queue_task() for managed_node2/package
13531 1726882455.38329: worker is 1 (out of 1 available)
13531 1726882455.38341: exiting _queue_task() for managed_node2/package
13531 1726882455.38357: done queuing things up, now waiting for results queue to drain
13531 1726882455.38359: waiting for pending results...
13531 1726882455.38674: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
13531 1726882455.38819: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000124
13531 1726882455.38830: variable 'ansible_search_path' from source: unknown
13531 1726882455.38834: variable 'ansible_search_path' from source: unknown
13531 1726882455.38881: calling self._execute()
13531 1726882455.38989: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882455.38993: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882455.39004: variable 'omit' from source: magic vars
13531 1726882455.39717: variable 'ansible_distribution_major_version' from source: facts
13531 1726882455.39730: Evaluated conditional (ansible_distribution_major_version != '6'): True
13531 1726882455.39863: variable 'network_state' from source: role '' defaults
13531 1726882455.39883: Evaluated conditional (network_state != {}): False
13531 1726882455.39887: when evaluation is False, skipping this task
13531 1726882455.39890: _execute() done
13531 1726882455.39893: dumping result to json
13531 1726882455.39895: done dumping result, returning
13531 1726882455.39903: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-4fd9-519d-000000000124]
13531 1726882455.39909: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000124
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
13531 1726882455.40060: no more pending results, returning what we have
13531 1726882455.40082: results queue empty
13531 1726882455.40085: checking for any_errors_fatal
13531 1726882455.40094: done checking for any_errors_fatal
13531 1726882455.40095: checking for max_fail_percentage
13531 1726882455.40097: done checking for max_fail_percentage
13531 1726882455.40098: checking to see if all hosts have failed and the running result is not ok
13531 1726882455.40099: done checking to see if all hosts have failed
13531 1726882455.40100: getting the remaining hosts for this loop
13531 1726882455.40102: done getting the remaining hosts for this loop
13531 1726882455.40105: getting the next task for host managed_node2
13531 1726882455.40112: done getting next task for host managed_node2
13531 1726882455.40116: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
13531 1726882455.40121: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13531 1726882455.40149: getting variables
13531 1726882455.40151: in VariableManager get_vars()
13531 1726882455.40213: Calling all_inventory to load vars for managed_node2
13531 1726882455.40216: Calling groups_inventory to load vars for managed_node2
13531 1726882455.40219: Calling all_plugins_inventory to load vars for managed_node2
13531 1726882455.40234: Calling all_plugins_play to load vars for managed_node2
13531 1726882455.40238: Calling groups_plugins_inventory to load vars for managed_node2
13531 1726882455.40242: Calling groups_plugins_play to load vars for managed_node2
13531 1726882455.40871: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000124
13531 1726882455.40875: WORKER PROCESS EXITING
13531 1726882455.42146: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13531 1726882455.45752: done with get_vars()
13531 1726882455.45793: done getting variables
13531 1726882455.45859: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Friday 20 September 2024 21:34:15 -0400 (0:00:00.082) 0:00:43.354 ******
13531 1726882455.45900: entering _queue_task() for managed_node2/package
13531 1726882455.46550: worker is 1 (out of 1 available)
13531 1726882455.46769: exiting _queue_task() for managed_node2/package
13531 1726882455.46785: done queuing things up, now waiting for results queue to drain
13531 1726882455.46787: waiting for pending results...
13531 1726882455.47439: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
13531 1726882455.47843: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000125
13531 1726882455.47866: variable 'ansible_search_path' from source: unknown
13531 1726882455.47875: variable 'ansible_search_path' from source: unknown
13531 1726882455.48031: calling self._execute()
13531 1726882455.48135: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882455.48239: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882455.48253: variable 'omit' from source: magic vars
13531 1726882455.48963: variable 'ansible_distribution_major_version' from source: facts
13531 1726882455.48985: Evaluated conditional (ansible_distribution_major_version != '6'): True
13531 1726882455.49177: variable 'network_state' from source: role '' defaults
13531 1726882455.49334: Evaluated conditional (network_state != {}): False
13531 1726882455.49342: when evaluation is False, skipping this task
13531 1726882455.49350: _execute() done
13531 1726882455.49358: dumping result to json
13531 1726882455.49367: done dumping result, returning
13531 1726882455.49380: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-4fd9-519d-000000000125]
13531 1726882455.49392: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000125
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
13531 1726882455.49560: no more pending results, returning what we have
13531 1726882455.49566: results queue empty
13531 1726882455.49568: checking for any_errors_fatal
13531 1726882455.49575: done checking for any_errors_fatal
13531 1726882455.49576: checking for max_fail_percentage
13531 1726882455.49578: done checking for max_fail_percentage
13531 1726882455.49579: checking to see if all hosts have failed and the running result is not ok
13531 1726882455.49580: done checking to see if all hosts have failed
13531 1726882455.49581: getting the remaining hosts for this loop
13531 1726882455.49582: done getting the remaining hosts for this loop
13531 1726882455.49585: getting the next task for host managed_node2
13531 1726882455.49593: done getting next task for host managed_node2
13531 1726882455.49597: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
13531 1726882455.49601: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13531 1726882455.49627: getting variables
13531 1726882455.49629: in VariableManager get_vars()
13531 1726882455.49691: Calling all_inventory to load vars for managed_node2
13531 1726882455.49694: Calling groups_inventory to load vars for managed_node2
13531 1726882455.49697: Calling all_plugins_inventory to load vars for managed_node2
13531 1726882455.49711: Calling all_plugins_play to load vars for managed_node2
13531 1726882455.49714: Calling groups_plugins_inventory to load vars for managed_node2
13531 1726882455.49717: Calling groups_plugins_play to load vars for managed_node2
13531 1726882455.50300: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000125
13531 1726882455.50303: WORKER PROCESS EXITING
13531 1726882455.58373: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13531 1726882455.60093: done with get_vars()
13531 1726882455.60134: done getting variables
13531 1726882455.60181: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Friday 20 September 2024 21:34:15 -0400 (0:00:00.143) 0:00:43.497 ******
13531 1726882455.60210: entering _queue_task() for managed_node2/service
13531 1726882455.61001: worker is 1 (out of 1 available)
13531 1726882455.61014: exiting _queue_task() for managed_node2/service
13531 1726882455.61028: done queuing things up, now waiting for results queue to drain
13531 1726882455.61030: waiting for pending results...
13531 1726882455.62191: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
13531 1726882455.62323: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000126
13531 1726882455.62335: variable 'ansible_search_path' from source: unknown
13531 1726882455.62339: variable 'ansible_search_path' from source: unknown
13531 1726882455.62387: calling self._execute()
13531 1726882455.62512: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882455.62523: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882455.62528: variable 'omit' from source: magic vars
13531 1726882455.62975: variable 'ansible_distribution_major_version' from source: facts
13531 1726882455.62988: Evaluated conditional (ansible_distribution_major_version != '6'): True
13531 1726882455.63170: variable '__network_wireless_connections_defined' from source: role '' defaults
13531 1726882455.63498: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13531 1726882455.66322: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13531 1726882455.66413: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13531 1726882455.66461: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13531 1726882455.66506: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13531 1726882455.66535: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13531 1726882455.66622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13531 1726882455.66657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13531 1726882455.66696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13531 1726882455.66739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13531 1726882455.66756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13531 1726882455.66811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13531 1726882455.66838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13531 1726882455.66867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13531 1726882455.66920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13531 1726882455.66939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13531 1726882455.66990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13531 1726882455.67023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13531 1726882455.67052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13531 1726882455.67102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13531 1726882455.67126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13531 1726882455.67319: variable 'network_connections' from source: task vars
13531 1726882455.67340: variable 'port1_profile' from source: play vars
13531 1726882455.67414: variable 'port1_profile' from source: play vars
13531 1726882455.67435: variable 'port2_profile' from source: play vars
13531 1726882455.67506: variable 'port2_profile' from source: play vars
13531 1726882455.67592: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
13531 1726882455.67786: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
13531 1726882455.67826: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
13531 1726882455.67865: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
13531 1726882455.67901: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
13531 1726882455.67945: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
13531 1726882455.67977: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
13531 1726882455.68010: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
13531 1726882455.68038: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
13531 1726882455.68100: variable '__network_team_connections_defined' from source: role '' defaults
13531 1726882455.68352: variable 'network_connections' from source: task vars
13531 1726882455.68362: variable 'port1_profile' from source: play vars
13531 1726882455.68439: variable 'port1_profile' from source: play vars
13531 1726882455.68451: variable 'port2_profile' from source: play vars
13531 1726882455.68519: variable 'port2_profile' from source: play vars
13531 1726882455.68551: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
13531 1726882455.68560: when evaluation is False, skipping this task
13531 1726882455.68569: _execute() done
13531 1726882455.68577: dumping result to json
13531 1726882455.68584: done dumping result, returning
13531 1726882455.68597: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-4fd9-519d-000000000126]
13531 1726882455.68622: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000126
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
13531 1726882455.68772: no more pending results, returning what we have
13531 1726882455.68777: results queue empty
13531 1726882455.68778: checking for any_errors_fatal
13531 1726882455.68786: done checking for any_errors_fatal
13531 1726882455.68787: checking for max_fail_percentage
13531 1726882455.68789: done checking for max_fail_percentage
13531 1726882455.68790: checking to see if all hosts have failed and the running result is not ok
13531 1726882455.68791: done checking to see if all hosts have failed
13531 1726882455.68791: getting the remaining hosts for this loop
13531 1726882455.68793: done getting the remaining hosts for this loop
13531 1726882455.68796: getting the next task for host managed_node2
13531 1726882455.68803: done getting next task for host managed_node2
13531 1726882455.68807: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
13531 1726882455.68810: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13531 1726882455.68830: getting variables
13531 1726882455.68832: in VariableManager get_vars()
13531 1726882455.68888: Calling all_inventory to load vars for managed_node2
13531 1726882455.68891: Calling groups_inventory to load vars for managed_node2
13531 1726882455.68893: Calling all_plugins_inventory to load vars for managed_node2
13531 1726882455.68904: Calling all_plugins_play to load vars for managed_node2
13531 1726882455.68907: Calling groups_plugins_inventory to load vars for managed_node2
13531 1726882455.68909: Calling groups_plugins_play to load vars for managed_node2
13531 1726882455.69904: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000126
13531 1726882455.69907: WORKER PROCESS EXITING
13531 1726882455.70700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13531 1726882455.73093: done with get_vars()
13531 1726882455.73127: done getting variables
13531 1726882455.73195: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Friday 20 September 2024 21:34:15 -0400 (0:00:00.130) 0:00:43.627 ******
13531 1726882455.73232: entering _queue_task() for managed_node2/service
13531 1726882455.73569: worker is 1 (out of 1 available)
13531 1726882455.73589: exiting _queue_task() for managed_node2/service
13531 1726882455.73603: done queuing things up, now waiting for results queue to drain
13531 1726882455.73605: waiting for pending results...
13531 1726882455.74026: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
13531 1726882455.74197: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000127
13531 1726882455.74208: variable 'ansible_search_path' from source: unknown
13531 1726882455.74217: variable 'ansible_search_path' from source: unknown
13531 1726882455.74257: calling self._execute()
13531 1726882455.74379: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882455.74382: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882455.74397: variable 'omit' from source: magic vars
13531 1726882455.74847: variable 'ansible_distribution_major_version' from source: facts
13531 1726882455.74861: Evaluated conditional (ansible_distribution_major_version != '6'): True
13531 1726882455.75074: variable 'network_provider' from source: set_fact
13531 1726882455.75082: variable 'network_state' from source: role '' defaults
13531 1726882455.75093: Evaluated conditional (network_provider == "nm" or network_state != {}): True
13531 1726882455.75099: variable 'omit' from source: magic vars
13531 1726882455.75167: variable 'omit' from source: magic vars
13531 1726882455.75197: variable 'network_service_name' from source: role '' defaults
13531 1726882455.76042: variable 'network_service_name' from source: role '' defaults
13531 1726882455.76278: variable '__network_provider_setup' from source: role '' defaults
13531 1726882455.76284: variable '__network_service_name_default_nm' from source: role '' defaults
13531 1726882455.76462: variable '__network_service_name_default_nm' from source: role '' defaults
13531 1726882455.76473: variable '__network_packages_default_nm' from source: role '' defaults
13531 1726882455.76645: variable '__network_packages_default_nm' from source: role '' defaults
13531 1726882455.77106: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13531 1726882455.82420: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13531 1726882455.82642: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13531 1726882455.82686: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13531 1726882455.82718: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13531 1726882455.82797: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13531 1726882455.82993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13531 1726882455.83021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13531 1726882455.83046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13531 1726882455.83202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13531 1726882455.83215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13531 1726882455.83257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13531 1726882455.83294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13531 1726882455.83316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13531 1726882455.83354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13531 1726882455.83372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13531 1726882455.83656: variable '__network_packages_default_gobject_packages' from source: role '' defaults
13531 1726882455.83955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13531 1726882455.83984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13531 1726882455.84006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13531 1726882455.84278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13531 1726882455.84315: Loading FilterModule
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882455.84401: variable 'ansible_python' from source: facts 13531 1726882455.84423: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 13531 1726882455.84710: variable '__network_wpa_supplicant_required' from source: role '' defaults 13531 1726882455.84915: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13531 1726882455.85169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882455.85190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882455.85214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882455.85373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882455.85388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882455.85436: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882455.85582: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882455.85606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882455.85646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882455.85666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882455.86072: variable 'network_connections' from source: task vars 13531 1726882455.86181: variable 'port1_profile' from source: play vars 13531 1726882455.87165: variable 'port1_profile' from source: play vars 13531 1726882455.87229: variable 'port2_profile' from source: play vars 13531 1726882455.87312: variable 'port2_profile' from source: play vars 13531 1726882455.87634: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13531 1726882455.88151: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13531 1726882455.88321: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13531 1726882455.88368: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13531 1726882455.88561: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13531 1726882455.88753: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13531 1726882455.88790: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13531 1726882455.88825: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882455.88975: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13531 1726882455.89030: variable '__network_wireless_connections_defined' from source: role '' defaults 13531 1726882455.89668: variable 'network_connections' from source: task vars 13531 1726882455.89672: variable 'port1_profile' from source: play vars 13531 1726882455.89872: variable 'port1_profile' from source: play vars 13531 1726882455.89948: variable 'port2_profile' from source: play vars 13531 1726882455.90105: variable 'port2_profile' from source: play vars 13531 1726882455.90140: variable '__network_packages_default_wireless' from source: role '' defaults 13531 1726882455.90337: variable '__network_wireless_connections_defined' from source: role '' defaults 13531 1726882455.90971: variable 'network_connections' from source: task vars 13531 1726882455.90976: variable 'port1_profile' from source: play vars 13531 1726882455.91163: variable 'port1_profile' from source: play vars 13531 1726882455.91172: variable 'port2_profile' from source: play vars 13531 1726882455.91351: variable 'port2_profile' from source: play vars 13531 1726882455.91382: variable '__network_packages_default_team' from source: role '' defaults 13531 1726882455.91575: variable '__network_team_connections_defined' from source: role '' defaults 
13531 1726882455.92233: variable 'network_connections' from source: task vars 13531 1726882455.92236: variable 'port1_profile' from source: play vars 13531 1726882455.92309: variable 'port1_profile' from source: play vars 13531 1726882455.92316: variable 'port2_profile' from source: play vars 13531 1726882455.92497: variable 'port2_profile' from source: play vars 13531 1726882455.92672: variable '__network_service_name_default_initscripts' from source: role '' defaults 13531 1726882455.92731: variable '__network_service_name_default_initscripts' from source: role '' defaults 13531 1726882455.92737: variable '__network_packages_default_initscripts' from source: role '' defaults 13531 1726882455.92908: variable '__network_packages_default_initscripts' from source: role '' defaults 13531 1726882455.93425: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 13531 1726882455.94420: variable 'network_connections' from source: task vars 13531 1726882455.94423: variable 'port1_profile' from source: play vars 13531 1726882455.94481: variable 'port1_profile' from source: play vars 13531 1726882455.94489: variable 'port2_profile' from source: play vars 13531 1726882455.94549: variable 'port2_profile' from source: play vars 13531 1726882455.94555: variable 'ansible_distribution' from source: facts 13531 1726882455.94565: variable '__network_rh_distros' from source: role '' defaults 13531 1726882455.94574: variable 'ansible_distribution_major_version' from source: facts 13531 1726882455.94588: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 13531 1726882455.94885: variable 'ansible_distribution' from source: facts 13531 1726882455.94889: variable '__network_rh_distros' from source: role '' defaults 13531 1726882455.94893: variable 'ansible_distribution_major_version' from source: facts 13531 1726882455.94908: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' 
defaults 13531 1726882455.95314: variable 'ansible_distribution' from source: facts 13531 1726882455.95317: variable '__network_rh_distros' from source: role '' defaults 13531 1726882455.95323: variable 'ansible_distribution_major_version' from source: facts 13531 1726882455.95369: variable 'network_provider' from source: set_fact 13531 1726882455.95508: variable 'omit' from source: magic vars 13531 1726882455.95541: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882455.95574: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882455.95600: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882455.95733: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882455.95743: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882455.95778: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882455.95782: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882455.95784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882455.96012: Set connection var ansible_pipelining to False 13531 1726882455.96018: Set connection var ansible_timeout to 10 13531 1726882455.96024: Set connection var ansible_shell_executable to /bin/sh 13531 1726882455.96029: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882455.96263: Set connection var ansible_connection to ssh 13531 1726882455.96269: Set connection var ansible_shell_type to sh 13531 1726882455.96300: variable 'ansible_shell_executable' from source: unknown 13531 1726882455.96303: variable 'ansible_connection' from source: unknown 13531 1726882455.96307: variable 
'ansible_module_compression' from source: unknown 13531 1726882455.96309: variable 'ansible_shell_type' from source: unknown 13531 1726882455.96316: variable 'ansible_shell_executable' from source: unknown 13531 1726882455.96318: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882455.96320: variable 'ansible_pipelining' from source: unknown 13531 1726882455.96322: variable 'ansible_timeout' from source: unknown 13531 1726882455.96324: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882455.96542: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882455.96553: variable 'omit' from source: magic vars 13531 1726882455.96562: starting attempt loop 13531 1726882455.96567: running the handler 13531 1726882455.96774: variable 'ansible_facts' from source: unknown 13531 1726882455.98512: _low_level_execute_command(): starting 13531 1726882455.98518: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13531 1726882455.99770: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882455.99774: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882455.99777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882455.99779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882455.99781: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882455.99783: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882455.99785: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882455.99788: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882455.99790: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882455.99792: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882455.99794: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882455.99796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882455.99797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882455.99799: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882455.99801: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882455.99802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882455.99809: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882455.99811: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882455.99813: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882455.99917: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882456.01574: stdout chunk (state=3): >>>/root <<< 13531 1726882456.01678: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882456.01760: stderr chunk (state=3): >>><<< 13531 1726882456.01765: stdout chunk (state=3): >>><<< 13531 1726882456.01790: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882456.01803: _low_level_execute_command(): starting 13531 1726882456.01809: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882456.0179-15431-220389794605954 `" && echo ansible-tmp-1726882456.0179-15431-220389794605954="` echo /root/.ansible/tmp/ansible-tmp-1726882456.0179-15431-220389794605954 `" ) && sleep 0' 13531 1726882456.02458: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882456.02471: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882456.02482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882456.02496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882456.02534: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882456.02541: stderr chunk 
(state=3): >>>debug2: match not found <<< 13531 1726882456.02551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882456.02571: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882456.02580: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882456.02585: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882456.02593: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882456.02603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882456.02615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882456.02622: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882456.02629: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882456.02638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882456.02712: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882456.02727: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882456.02732: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882456.03127: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882456.05010: stdout chunk (state=3): >>>ansible-tmp-1726882456.0179-15431-220389794605954=/root/.ansible/tmp/ansible-tmp-1726882456.0179-15431-220389794605954 <<< 13531 1726882456.05014: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882456.05016: stderr chunk (state=3): >>><<< 13531 1726882456.05018: stdout chunk (state=3): >>><<< 13531 1726882456.05036: 
_low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882456.0179-15431-220389794605954=/root/.ansible/tmp/ansible-tmp-1726882456.0179-15431-220389794605954 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882456.05074: variable 'ansible_module_compression' from source: unknown 13531 1726882456.05129: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13531i_snf1g8/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 13531 1726882456.05186: variable 'ansible_facts' from source: unknown 13531 1726882456.05391: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882456.0179-15431-220389794605954/AnsiballZ_systemd.py 13531 1726882456.05545: Sending initial data 13531 1726882456.05548: Sent initial data (153 bytes) 13531 1726882456.06935: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882456.06943: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882456.06954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882456.06975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882456.07013: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882456.07021: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882456.07030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882456.07044: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882456.07051: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882456.07061: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882456.07074: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882456.07083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882456.07094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882456.07103: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882456.07110: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882456.07118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882456.07195: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882456.07244: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882456.07247: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882456.07998: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882456.09793: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13531 1726882456.09889: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13531 1726882456.09991: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13531i_snf1g8/tmpqncuwie6 /root/.ansible/tmp/ansible-tmp-1726882456.0179-15431-220389794605954/AnsiballZ_systemd.py <<< 13531 1726882456.10093: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13531 1726882456.14341: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882456.14494: stderr chunk (state=3): >>><<< 13531 1726882456.14498: stdout chunk (state=3): >>><<< 13531 1726882456.14515: done transferring module to remote 13531 1726882456.14526: _low_level_execute_command(): starting 13531 1726882456.14531: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882456.0179-15431-220389794605954/ /root/.ansible/tmp/ansible-tmp-1726882456.0179-15431-220389794605954/AnsiballZ_systemd.py && sleep 0' 13531 1726882456.15180: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882456.15195: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 
1726882456.15211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882456.15229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882456.15276: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882456.15290: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882456.15304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882456.15322: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882456.15334: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882456.15346: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882456.15358: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882456.15374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882456.15394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882456.15406: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882456.15417: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882456.15430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882456.15508: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882456.15525: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882456.15540: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882456.15689: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 
1726882456.17484: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882456.17560: stderr chunk (state=3): >>><<< 13531 1726882456.17566: stdout chunk (state=3): >>><<< 13531 1726882456.17656: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882456.17667: _low_level_execute_command(): starting 13531 1726882456.17671: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882456.0179-15431-220389794605954/AnsiballZ_systemd.py && sleep 0' 13531 1726882456.18228: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882456.18244: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882456.18260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 
1726882456.18285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882456.18326: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882456.18339: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882456.18353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882456.18375: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882456.18388: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882456.18399: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882456.18411: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882456.18425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882456.18440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882456.18452: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882456.18467: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882456.18482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882456.18556: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882456.18576: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882456.18591: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882456.18733: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882456.43859: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", 
"ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManag<<< 13531 
1726882456.43894: stdout chunk (state=3): >>>er.service", "ControlGroupId": "3602", "MemoryCurrent": "9117696", "MemoryAvailable": "infinity", "CPUUsageNSec": "1110082000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", 
"LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", 
"KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", 
"JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 13531 1726882456.45416: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 13531 1726882456.45501: stderr chunk (state=3): >>><<< 13531 1726882456.45504: stdout chunk (state=3): >>><<< 13531 1726882456.45791: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", 
"ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "9117696", "MemoryAvailable": "infinity", "CPUUsageNSec": "1110082000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not 
set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": 
"0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", 
"AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
13531 1726882456.45802: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882456.0179-15431-220389794605954/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13531 1726882456.45804: _low_level_execute_command(): starting 13531 1726882456.45807: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882456.0179-15431-220389794605954/ > /dev/null 2>&1 && sleep 0' 13531 1726882456.47012: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882456.47682: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882456.47698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882456.47717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882456.47765: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882456.47779: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882456.47793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882456.47811: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882456.47824: stderr 
chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882456.47835: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882456.47847: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882456.47861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882456.47884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882456.47897: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882456.47909: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882456.47923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882456.47998: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882456.48021: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882456.48038: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882456.48174: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882456.50102: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882456.50106: stdout chunk (state=3): >>><<< 13531 1726882456.50109: stderr chunk (state=3): >>><<< 13531 1726882456.50170: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882456.50174: handler run complete 13531 1726882456.50269: attempt loop complete, returning result 13531 1726882456.50273: _execute() done 13531 1726882456.50275: dumping result to json 13531 1726882456.50277: done dumping result, returning 13531 1726882456.50280: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-4fd9-519d-000000000127] 13531 1726882456.50282: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000127 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13531 1726882456.50846: no more pending results, returning what we have 13531 1726882456.50848: results queue empty 13531 1726882456.50849: checking for any_errors_fatal 13531 1726882456.50852: done checking for any_errors_fatal 13531 1726882456.50853: checking for max_fail_percentage 13531 1726882456.50857: done checking for max_fail_percentage 13531 1726882456.50858: checking to see if all hosts have failed and the running result is not ok 13531 1726882456.50859: done checking to see if all hosts have failed 13531 1726882456.50859: getting the remaining hosts for this loop 13531 1726882456.50861: done getting the 
remaining hosts for this loop 13531 1726882456.50865: getting the next task for host managed_node2 13531 1726882456.50870: done getting next task for host managed_node2 13531 1726882456.50874: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13531 1726882456.50876: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882456.50886: getting variables 13531 1726882456.50887: in VariableManager get_vars() 13531 1726882456.50930: Calling all_inventory to load vars for managed_node2 13531 1726882456.50932: Calling groups_inventory to load vars for managed_node2 13531 1726882456.50935: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882456.50945: Calling all_plugins_play to load vars for managed_node2 13531 1726882456.50947: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882456.50949: Calling groups_plugins_play to load vars for managed_node2 13531 1726882456.51643: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000127 13531 1726882456.51647: WORKER PROCESS EXITING 13531 1726882456.53221: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882456.56962: done with get_vars() 13531 1726882456.56994: done getting variables 13531 1726882456.57173: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:34:16 -0400 (0:00:00.839) 0:00:44.467 ****** 13531 1726882456.57211: entering _queue_task() for managed_node2/service 13531 1726882456.58807: worker is 1 (out of 1 available) 13531 1726882456.58819: exiting _queue_task() for managed_node2/service 13531 1726882456.58831: done queuing things up, now waiting for results queue to drain 13531 1726882456.58833: waiting for pending results... 13531 1726882456.59430: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13531 1726882456.59559: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000128 13531 1726882456.59586: variable 'ansible_search_path' from source: unknown 13531 1726882456.59590: variable 'ansible_search_path' from source: unknown 13531 1726882456.59630: calling self._execute() 13531 1726882456.59736: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882456.59741: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882456.59752: variable 'omit' from source: magic vars 13531 1726882456.60179: variable 'ansible_distribution_major_version' from source: facts 13531 1726882456.60192: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882456.60317: variable 'network_provider' from source: set_fact 13531 1726882456.60329: Evaluated conditional (network_provider == "nm"): True 13531 1726882456.60422: variable '__network_wpa_supplicant_required' from source: role '' defaults 13531 1726882456.60519: variable '__network_ieee802_1x_connections_defined' from source: role '' 
defaults 13531 1726882456.60707: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13531 1726882456.63085: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13531 1726882456.63147: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13531 1726882456.63195: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13531 1726882456.63228: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13531 1726882456.63254: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13531 1726882456.63518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882456.63545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882456.63574: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882456.63621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882456.63634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882456.63683: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882456.63714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882456.63737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882456.63780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882456.63793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882456.63839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882456.63869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882456.63892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882456.63937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 
1726882456.63950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882456.64099: variable 'network_connections' from source: task vars 13531 1726882456.64110: variable 'port1_profile' from source: play vars 13531 1726882456.64190: variable 'port1_profile' from source: play vars 13531 1726882456.64200: variable 'port2_profile' from source: play vars 13531 1726882456.64268: variable 'port2_profile' from source: play vars 13531 1726882456.64333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13531 1726882456.64508: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13531 1726882456.64542: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13531 1726882456.64580: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13531 1726882456.64607: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13531 1726882456.64645: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13531 1726882456.64671: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13531 1726882456.64698: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882456.64719: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13531 1726882456.64770: variable '__network_wireless_connections_defined' from source: role '' defaults 13531 1726882456.65046: variable 'network_connections' from source: task vars 13531 1726882456.65049: variable 'port1_profile' from source: play vars 13531 1726882456.65110: variable 'port1_profile' from source: play vars 13531 1726882456.65121: variable 'port2_profile' from source: play vars 13531 1726882456.65179: variable 'port2_profile' from source: play vars 13531 1726882456.65209: Evaluated conditional (__network_wpa_supplicant_required): False 13531 1726882456.65218: when evaluation is False, skipping this task 13531 1726882456.65228: _execute() done 13531 1726882456.65231: dumping result to json 13531 1726882456.65233: done dumping result, returning 13531 1726882456.65239: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-4fd9-519d-000000000128] 13531 1726882456.65245: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000128 13531 1726882456.65344: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000128 13531 1726882456.65348: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 13531 1726882456.65398: no more pending results, returning what we have 13531 1726882456.65403: results queue empty 13531 1726882456.65403: checking for any_errors_fatal 13531 1726882456.65421: done checking for any_errors_fatal 13531 1726882456.65422: checking for max_fail_percentage 13531 1726882456.65424: done checking for max_fail_percentage 13531 1726882456.65425: checking to see if all hosts have failed and the running result is not ok 13531 1726882456.65425: done checking to see if all hosts have failed 13531 1726882456.65426: getting the 
remaining hosts for this loop 13531 1726882456.65428: done getting the remaining hosts for this loop 13531 1726882456.65431: getting the next task for host managed_node2 13531 1726882456.65440: done getting next task for host managed_node2 13531 1726882456.65444: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 13531 1726882456.65448: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882456.65470: getting variables 13531 1726882456.65472: in VariableManager get_vars() 13531 1726882456.65525: Calling all_inventory to load vars for managed_node2 13531 1726882456.65527: Calling groups_inventory to load vars for managed_node2 13531 1726882456.65530: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882456.65540: Calling all_plugins_play to load vars for managed_node2 13531 1726882456.65543: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882456.65546: Calling groups_plugins_play to load vars for managed_node2 13531 1726882456.67858: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882456.69844: done with get_vars() 13531 1726882456.69884: done getting variables 13531 1726882456.69946: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:34:16 -0400 (0:00:00.127) 0:00:44.595 ****** 13531 1726882456.69987: entering _queue_task() for managed_node2/service 13531 1726882456.70381: worker is 1 (out of 1 available) 13531 1726882456.70398: exiting _queue_task() for managed_node2/service 13531 1726882456.70413: done queuing things up, now waiting for results queue to drain 13531 1726882456.70414: waiting for pending results... 13531 1726882456.70731: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 13531 1726882456.70885: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000129 13531 1726882456.70906: variable 'ansible_search_path' from source: unknown 13531 1726882456.70914: variable 'ansible_search_path' from source: unknown 13531 1726882456.70962: calling self._execute() 13531 1726882456.71100: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882456.71114: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882456.71128: variable 'omit' from source: magic vars 13531 1726882456.71544: variable 'ansible_distribution_major_version' from source: facts 13531 1726882456.71562: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882456.71687: variable 'network_provider' from source: set_fact 13531 1726882456.71698: Evaluated conditional (network_provider == "initscripts"): False 13531 1726882456.71706: when evaluation is False, skipping this task 13531 1726882456.71719: _execute() done 13531 1726882456.71730: dumping result to json 13531 1726882456.71738: done dumping result, 
returning 13531 1726882456.71748: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-4fd9-519d-000000000129] 13531 1726882456.71759: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000129 skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13531 1726882456.71911: no more pending results, returning what we have 13531 1726882456.71915: results queue empty 13531 1726882456.71916: checking for any_errors_fatal 13531 1726882456.71925: done checking for any_errors_fatal 13531 1726882456.71926: checking for max_fail_percentage 13531 1726882456.71928: done checking for max_fail_percentage 13531 1726882456.71929: checking to see if all hosts have failed and the running result is not ok 13531 1726882456.71930: done checking to see if all hosts have failed 13531 1726882456.71931: getting the remaining hosts for this loop 13531 1726882456.71933: done getting the remaining hosts for this loop 13531 1726882456.71936: getting the next task for host managed_node2 13531 1726882456.71942: done getting next task for host managed_node2 13531 1726882456.71947: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13531 1726882456.71951: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882456.71976: getting variables 13531 1726882456.71978: in VariableManager get_vars() 13531 1726882456.72034: Calling all_inventory to load vars for managed_node2 13531 1726882456.72037: Calling groups_inventory to load vars for managed_node2 13531 1726882456.72040: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882456.72053: Calling all_plugins_play to load vars for managed_node2 13531 1726882456.72056: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882456.72060: Calling groups_plugins_play to load vars for managed_node2 13531 1726882456.73134: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000129 13531 1726882456.73137: WORKER PROCESS EXITING 13531 1726882456.73927: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882456.76040: done with get_vars() 13531 1726882456.76186: done getting variables 13531 1726882456.76240: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:34:16 -0400 (0:00:00.063) 0:00:44.658 ****** 13531 1726882456.76379: entering _queue_task() for managed_node2/copy 13531 1726882456.76878: worker is 1 (out of 1 available) 13531 1726882456.76899: exiting _queue_task() for managed_node2/copy 13531 1726882456.76913: done queuing things up, now waiting for results queue to drain 13531 1726882456.76914: waiting for pending results... 
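The two skips above ("Enable network service" and "Ensure initscripts network file dependency is present") both report `false_condition: network_provider == "initscripts"`. A minimal sketch of the task shape that produces this kind of skip is below; it is an illustrative reconstruction only, since the log names the task and the conditional but not the module arguments, so the `copy` parameters here are assumptions:

```yaml
# Illustrative sketch, not the role's actual source.
# The real task lives at roles/network/tasks/main.yml:150 per the log;
# dest/content values below are assumed for the example.
- name: Ensure initscripts network file dependency is present
  ansible.builtin.copy:
    dest: /etc/sysconfig/network   # assumed destination
    content: ""                    # assumed content
  # The log shows this evaluating to False (network_provider was set to
  # something other than "initscripts" via set_fact), so the task is
  # reported as: skipping: [managed_node2] => {"changed": false, ...}
  when: network_provider == "initscripts"
```

When the `when:` expression evaluates to False, TaskExecutor short-circuits before the action plugin runs, which is why the log shows "when evaluation is False, skipping this task" immediately after "Evaluated conditional".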
13531 1726882456.77235: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13531 1726882456.77390: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000012a 13531 1726882456.77416: variable 'ansible_search_path' from source: unknown 13531 1726882456.77425: variable 'ansible_search_path' from source: unknown 13531 1726882456.77474: calling self._execute() 13531 1726882456.77585: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882456.77597: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882456.77611: variable 'omit' from source: magic vars 13531 1726882456.78025: variable 'ansible_distribution_major_version' from source: facts 13531 1726882456.78042: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882456.78172: variable 'network_provider' from source: set_fact 13531 1726882456.78184: Evaluated conditional (network_provider == "initscripts"): False 13531 1726882456.78193: when evaluation is False, skipping this task 13531 1726882456.78201: _execute() done 13531 1726882456.78208: dumping result to json 13531 1726882456.78214: done dumping result, returning 13531 1726882456.78231: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-4fd9-519d-00000000012a] 13531 1726882456.78243: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000012a skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 13531 1726882456.78404: no more pending results, returning what we have 13531 1726882456.78409: results queue empty 13531 1726882456.78410: checking for any_errors_fatal 13531 1726882456.78416: done checking for any_errors_fatal 13531 1726882456.78417: checking for max_fail_percentage 13531 
1726882456.78419: done checking for max_fail_percentage 13531 1726882456.78420: checking to see if all hosts have failed and the running result is not ok 13531 1726882456.78420: done checking to see if all hosts have failed 13531 1726882456.78421: getting the remaining hosts for this loop 13531 1726882456.78423: done getting the remaining hosts for this loop 13531 1726882456.78427: getting the next task for host managed_node2 13531 1726882456.78434: done getting next task for host managed_node2 13531 1726882456.78438: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13531 1726882456.78442: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882456.78465: getting variables 13531 1726882456.78468: in VariableManager get_vars() 13531 1726882456.78526: Calling all_inventory to load vars for managed_node2 13531 1726882456.78530: Calling groups_inventory to load vars for managed_node2 13531 1726882456.78532: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882456.78546: Calling all_plugins_play to load vars for managed_node2 13531 1726882456.78550: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882456.78553: Calling groups_plugins_play to load vars for managed_node2 13531 1726882456.79668: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000012a 13531 1726882456.79674: WORKER PROCESS EXITING 13531 1726882456.80637: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882456.83193: done with get_vars() 13531 1726882456.83225: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:34:16 -0400 (0:00:00.069) 0:00:44.728 ****** 13531 1726882456.83319: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 13531 1726882456.83659: worker is 1 (out of 1 available) 13531 1726882456.83674: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 13531 1726882456.83690: done queuing things up, now waiting for results queue to drain 13531 1726882456.83696: waiting for pending results... 
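The task queued next ("Configure networking connection profiles") consumes a `network_connections` list that, per the variable traces above, is assembled from the `port1_profile` and `port2_profile` play vars. A hedged sketch of what those play vars might look like follows; the log only names the variables, so every value and key below beyond the variable names themselves is a hypothetical reconstruction:

```yaml
# Hypothetical play vars; only the variable names appear in the log.
port1_profile: bond0.0   # assumed profile name
port2_profile: bond0.1   # assumed profile name

# network_connections referencing the two port profiles (shape assumed
# from the fedora.linux_system_roles.network role's documented interface):
network_connections:
  - name: "{{ port1_profile }}"
    state: up
  - name: "{{ port2_profile }}"
    state: up
```

This is why each profile variable is resolved twice in the trace: once when the `network_connections` template references it, and once when the resulting value is interpolated.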
13531 1726882456.84022: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13531 1726882456.84194: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000012b 13531 1726882456.84214: variable 'ansible_search_path' from source: unknown 13531 1726882456.84223: variable 'ansible_search_path' from source: unknown 13531 1726882456.84275: calling self._execute() 13531 1726882456.84386: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882456.84398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882456.84412: variable 'omit' from source: magic vars 13531 1726882456.84839: variable 'ansible_distribution_major_version' from source: facts 13531 1726882456.84859: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882456.84873: variable 'omit' from source: magic vars 13531 1726882456.84944: variable 'omit' from source: magic vars 13531 1726882456.85120: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13531 1726882456.87588: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13531 1726882456.87670: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13531 1726882456.87714: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13531 1726882456.87762: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13531 1726882456.87804: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13531 1726882456.87908: variable 'network_provider' from source: set_fact 13531 1726882456.88167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882456.88349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882456.88384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882456.88433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882456.88449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882456.88533: variable 'omit' from source: magic vars 13531 1726882456.88771: variable 'omit' from source: magic vars 13531 1726882456.88941: variable 'network_connections' from source: task vars 13531 1726882456.89079: variable 'port1_profile' from source: play vars 13531 1726882456.89143: variable 'port1_profile' from source: play vars 13531 1726882456.89294: variable 'port2_profile' from source: play vars 13531 1726882456.89357: variable 'port2_profile' from source: play vars 13531 1726882456.89634: variable 'omit' from source: magic vars 13531 1726882456.89703: variable '__lsr_ansible_managed' from source: task vars 13531 1726882456.89773: variable '__lsr_ansible_managed' from source: task vars 13531 1726882456.90366: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 13531 1726882456.90635: Loaded config def from plugin (lookup/template) 13531 1726882456.90645: Loading LookupModule 'template' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 13531 1726882456.90684: File lookup term: get_ansible_managed.j2 13531 1726882456.90691: variable 'ansible_search_path' from source: unknown 13531 1726882456.90699: evaluation_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 13531 1726882456.90722: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 13531 1726882456.90743: variable 'ansible_search_path' from source: unknown 13531 1726882456.99712: variable 'ansible_managed' from source: unknown 13531 1726882456.99888: variable 'omit' from source: magic vars 13531 1726882456.99926: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882456.99967: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882456.99993: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882457.00016: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882457.00031: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882457.00073: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882457.00081: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882457.00089: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882457.00202: Set connection var ansible_pipelining to False 13531 1726882457.00213: Set connection var ansible_timeout to 10 13531 1726882457.00223: Set connection var ansible_shell_executable to /bin/sh 13531 1726882457.00232: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882457.00238: Set connection var ansible_connection to ssh 13531 1726882457.00244: Set connection var ansible_shell_type to sh 13531 1726882457.00284: variable 'ansible_shell_executable' from source: unknown 13531 1726882457.00292: variable 'ansible_connection' from source: unknown 13531 1726882457.00300: variable 'ansible_module_compression' from source: unknown 13531 1726882457.00306: variable 'ansible_shell_type' from source: unknown 13531 1726882457.00313: variable 'ansible_shell_executable' from source: unknown 13531 1726882457.00319: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882457.00326: variable 'ansible_pipelining' from source: unknown 13531 1726882457.00333: variable 'ansible_timeout' from source: unknown 13531 1726882457.00340: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882457.00486: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13531 1726882457.00512: variable 'omit' from source: magic vars 13531 1726882457.00524: starting attempt loop 13531 1726882457.00531: running the handler 13531 1726882457.00549: _low_level_execute_command(): starting 13531 1726882457.00559: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13531 1726882457.01334: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882457.01356: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882457.01378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882457.01398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882457.01441: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882457.01454: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882457.01477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882457.01496: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882457.01508: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882457.01520: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882457.01533: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882457.01548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882457.01567: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882457.01584: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.11.158 originally 10.31.11.158 <<< 13531 1726882457.01597: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882457.01612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882457.01694: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882457.01718: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882457.01736: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882457.01879: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882457.03538: stdout chunk (state=3): >>>/root <<< 13531 1726882457.03640: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882457.03729: stderr chunk (state=3): >>><<< 13531 1726882457.03742: stdout chunk (state=3): >>><<< 13531 1726882457.03861: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882457.03867: _low_level_execute_command(): starting 13531 1726882457.03870: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882457.037745-15483-279745125500516 `" && echo ansible-tmp-1726882457.037745-15483-279745125500516="` echo /root/.ansible/tmp/ansible-tmp-1726882457.037745-15483-279745125500516 `" ) && sleep 0' 13531 1726882457.04445: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882457.04459: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882457.04481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882457.04500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882457.04548: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882457.04561: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882457.04578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882457.04596: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882457.04608: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882457.04623: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882457.04636: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882457.04650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882457.04669: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882457.04682: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882457.04693: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882457.04707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882457.04789: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882457.04813: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882457.04830: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882457.04974: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882457.06849: stdout chunk (state=3): >>>ansible-tmp-1726882457.037745-15483-279745125500516=/root/.ansible/tmp/ansible-tmp-1726882457.037745-15483-279745125500516 <<< 13531 1726882457.06965: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882457.07051: stderr chunk (state=3): >>><<< 13531 1726882457.07065: stdout chunk (state=3): >>><<< 13531 1726882457.07473: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882457.037745-15483-279745125500516=/root/.ansible/tmp/ansible-tmp-1726882457.037745-15483-279745125500516 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882457.07482: variable 'ansible_module_compression' from source: unknown 13531 1726882457.07484: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13531i_snf1g8/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 13531 1726882457.07486: variable 'ansible_facts' from source: unknown 13531 1726882457.07488: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882457.037745-15483-279745125500516/AnsiballZ_network_connections.py 13531 1726882457.07772: Sending initial data 13531 1726882457.07776: Sent initial data (167 bytes) 13531 1726882457.08551: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882457.08562: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882457.08575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882457.08589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882457.08641: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882457.08644: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882457.08646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882457.08653: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass <<< 13531 1726882457.08666: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882457.08678: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882457.08686: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882457.08695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882457.08706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882457.08713: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882457.08719: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882457.08728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882457.08801: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882457.08815: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882457.08825: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882457.08953: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882457.10701: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13531 1726882457.10797: 
stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13531 1726882457.10896: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13531i_snf1g8/tmpercvtaeg /root/.ansible/tmp/ansible-tmp-1726882457.037745-15483-279745125500516/AnsiballZ_network_connections.py <<< 13531 1726882457.10991: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13531 1726882457.12685: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882457.12965: stderr chunk (state=3): >>><<< 13531 1726882457.12979: stdout chunk (state=3): >>><<< 13531 1726882457.12983: done transferring module to remote 13531 1726882457.12985: _low_level_execute_command(): starting 13531 1726882457.12988: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882457.037745-15483-279745125500516/ /root/.ansible/tmp/ansible-tmp-1726882457.037745-15483-279745125500516/AnsiballZ_network_connections.py && sleep 0' 13531 1726882457.13614: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882457.13634: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882457.13654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882457.13676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882457.13719: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882457.13732: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882457.13755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882457.13780: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 
13531 1726882457.13793: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882457.13804: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882457.13815: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882457.13828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882457.13843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882457.13864: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882457.13879: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882457.13894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882457.13974: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882457.13998: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882457.14016: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882457.14146: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882457.16015: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882457.16262: stderr chunk (state=3): >>><<< 13531 1726882457.16267: stdout chunk (state=3): >>><<< 13531 1726882457.16359: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882457.16372: _low_level_execute_command(): starting 13531 1726882457.16375: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882457.037745-15483-279745125500516/AnsiballZ_network_connections.py && sleep 0' 13531 1726882457.16971: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882457.16987: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882457.17002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882457.17020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882457.17073: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882457.17086: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882457.17101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882457.17119: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882457.17131: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 
10.31.11.158 is address <<< 13531 1726882457.17148: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882457.17161: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882457.17177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882457.17193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882457.17205: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882457.17216: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882457.17229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882457.17311: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882457.17333: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882457.17351: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882457.17502: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882457.61079: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ycnyd6fm/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ycnyd6fm/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on 
bond0.0/207956e5-5781-4b3b-a739-18944f85bf50: error=unknown <<< 13531 1726882457.63449: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ycnyd6fm/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ycnyd6fm/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/5dda56b8-72f2-4584-944c-6391c4eeec78: error=unknown <<< 13531 1726882457.63670: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0.1", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0.1", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 13531 1726882457.65370: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 13531 1726882457.65375: stdout chunk (state=3): >>><<< 13531 1726882457.65388: stderr chunk (state=3): >>><<< 13531 1726882457.65545: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ycnyd6fm/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ycnyd6fm/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/207956e5-5781-4b3b-a739-18944f85bf50: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ycnyd6fm/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ycnyd6fm/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/5dda56b8-72f2-4584-944c-6391c4eeec78: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0.1", "persistent_state": 
"absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0.1", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
13531 1726882457.65556: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0.0', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0.1', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882457.037745-15483-279745125500516/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13531 1726882457.65560: _low_level_execute_command(): starting 13531 1726882457.65564: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882457.037745-15483-279745125500516/ > /dev/null 2>&1 && sleep 0' 13531 1726882457.67090: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882457.67234: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882457.67247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882457.67261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882457.67302: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882457.67308: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882457.67318: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882457.67332: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882457.67341: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882457.67356: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882457.67361: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882457.67374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882457.67385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882457.67392: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882457.67398: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882457.67407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882457.67598: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882457.67613: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882457.67618: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882457.67794: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882457.69708: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882457.69711: stdout chunk (state=3): >>><<< 13531 1726882457.69718: stderr chunk (state=3): >>><<< 13531 1726882457.69737: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882457.69743: handler run complete 13531 1726882457.69776: attempt loop complete, returning result 13531 1726882457.69779: _execute() done 13531 1726882457.69781: dumping result to json 13531 1726882457.69785: done dumping result, returning 13531 1726882457.69796: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-4fd9-519d-00000000012b] 13531 1726882457.69802: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000012b 13531 1726882457.69919: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000012b 13531 1726882457.69922: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0.1", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } 
STDERR: 13531 1726882457.70080: no more pending results, returning what we have 13531 1726882457.70084: results queue empty 13531 1726882457.70085: checking for any_errors_fatal 13531 1726882457.70090: done checking for any_errors_fatal 13531 1726882457.70091: checking for max_fail_percentage 13531 1726882457.70093: done checking for max_fail_percentage 13531 1726882457.70094: checking to see if all hosts have failed and the running result is not ok 13531 1726882457.70095: done checking to see if all hosts have failed 13531 1726882457.70096: getting the remaining hosts for this loop 13531 1726882457.70097: done getting the remaining hosts for this loop 13531 1726882457.70101: getting the next task for host managed_node2 13531 1726882457.70107: done getting next task for host managed_node2 13531 1726882457.70111: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 13531 1726882457.70114: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882457.70124: getting variables 13531 1726882457.70126: in VariableManager get_vars() 13531 1726882457.70182: Calling all_inventory to load vars for managed_node2 13531 1726882457.70185: Calling groups_inventory to load vars for managed_node2 13531 1726882457.70187: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882457.70203: Calling all_plugins_play to load vars for managed_node2 13531 1726882457.70206: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882457.70208: Calling groups_plugins_play to load vars for managed_node2 13531 1726882457.73838: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882457.77562: done with get_vars() 13531 1726882457.77600: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:34:17 -0400 (0:00:00.943) 0:00:45.672 ****** 13531 1726882457.77688: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 13531 1726882457.78038: worker is 1 (out of 1 available) 13531 1726882457.78053: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 13531 1726882457.78073: done queuing things up, now waiting for results queue to drain 13531 1726882457.78074: waiting for pending results... 
13531 1726882457.78389: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 13531 1726882457.78544: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000012c 13531 1726882457.78571: variable 'ansible_search_path' from source: unknown 13531 1726882457.78586: variable 'ansible_search_path' from source: unknown 13531 1726882457.78635: calling self._execute() 13531 1726882457.78752: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882457.78770: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882457.78785: variable 'omit' from source: magic vars 13531 1726882457.79332: variable 'ansible_distribution_major_version' from source: facts 13531 1726882457.79356: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882457.79493: variable 'network_state' from source: role '' defaults 13531 1726882457.79508: Evaluated conditional (network_state != {}): False 13531 1726882457.79516: when evaluation is False, skipping this task 13531 1726882457.79523: _execute() done 13531 1726882457.79529: dumping result to json 13531 1726882457.79535: done dumping result, returning 13531 1726882457.79547: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-4fd9-519d-00000000012c] 13531 1726882457.79561: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000012c skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13531 1726882457.79729: no more pending results, returning what we have 13531 1726882457.79734: results queue empty 13531 1726882457.79735: checking for any_errors_fatal 13531 1726882457.79747: done checking for any_errors_fatal 13531 1726882457.79748: checking for max_fail_percentage 13531 1726882457.79750: done checking for max_fail_percentage 13531 1726882457.79751: 
checking to see if all hosts have failed and the running result is not ok 13531 1726882457.79752: done checking to see if all hosts have failed 13531 1726882457.79753: getting the remaining hosts for this loop 13531 1726882457.79757: done getting the remaining hosts for this loop 13531 1726882457.79761: getting the next task for host managed_node2 13531 1726882457.79769: done getting next task for host managed_node2 13531 1726882457.79773: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13531 1726882457.79777: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882457.79799: getting variables 13531 1726882457.79800: in VariableManager get_vars() 13531 1726882457.79859: Calling all_inventory to load vars for managed_node2 13531 1726882457.79862: Calling groups_inventory to load vars for managed_node2 13531 1726882457.79867: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882457.79879: Calling all_plugins_play to load vars for managed_node2 13531 1726882457.79883: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882457.79886: Calling groups_plugins_play to load vars for managed_node2 13531 1726882457.80913: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000012c 13531 1726882457.80917: WORKER PROCESS EXITING 13531 1726882457.82481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882457.84415: done with get_vars() 13531 1726882457.84450: done getting variables 13531 1726882457.84517: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:34:17 -0400 (0:00:00.068) 0:00:45.740 ****** 13531 1726882457.84560: entering _queue_task() for managed_node2/debug 13531 1726882457.84904: worker is 1 (out of 1 available) 13531 1726882457.84914: exiting _queue_task() for managed_node2/debug 13531 1726882457.84926: done queuing things up, now waiting for results queue to drain 13531 1726882457.84927: waiting for pending results... 
13531 1726882457.85844: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13531 1726882457.86252: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000012d 13531 1726882457.86309: variable 'ansible_search_path' from source: unknown 13531 1726882457.86312: variable 'ansible_search_path' from source: unknown 13531 1726882457.86428: calling self._execute() 13531 1726882457.86557: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882457.86563: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882457.86569: variable 'omit' from source: magic vars 13531 1726882457.87040: variable 'ansible_distribution_major_version' from source: facts 13531 1726882457.87044: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882457.87047: variable 'omit' from source: magic vars 13531 1726882457.87387: variable 'omit' from source: magic vars 13531 1726882457.87426: variable 'omit' from source: magic vars 13531 1726882457.87478: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882457.87518: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882457.87542: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882457.87570: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882457.87585: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882457.87624: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882457.87633: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882457.87641: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 13531 1726882457.87756: Set connection var ansible_pipelining to False 13531 1726882457.87770: Set connection var ansible_timeout to 10 13531 1726882457.87780: Set connection var ansible_shell_executable to /bin/sh 13531 1726882457.87788: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882457.87795: Set connection var ansible_connection to ssh 13531 1726882457.87802: Set connection var ansible_shell_type to sh 13531 1726882457.87837: variable 'ansible_shell_executable' from source: unknown 13531 1726882457.87844: variable 'ansible_connection' from source: unknown 13531 1726882457.87851: variable 'ansible_module_compression' from source: unknown 13531 1726882457.87860: variable 'ansible_shell_type' from source: unknown 13531 1726882457.87869: variable 'ansible_shell_executable' from source: unknown 13531 1726882457.87875: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882457.87882: variable 'ansible_pipelining' from source: unknown 13531 1726882457.87888: variable 'ansible_timeout' from source: unknown 13531 1726882457.87895: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882457.88039: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882457.88061: variable 'omit' from source: magic vars 13531 1726882457.88075: starting attempt loop 13531 1726882457.88082: running the handler 13531 1726882457.88222: variable '__network_connections_result' from source: set_fact 13531 1726882457.88329: handler run complete 13531 1726882457.88356: attempt loop complete, returning result 13531 1726882457.88395: _execute() done 13531 1726882457.88405: dumping result to json 13531 1726882457.88412: 
done dumping result, returning 13531 1726882457.88425: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-4fd9-519d-00000000012d] 13531 1726882457.88435: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000012d
ok: [managed_node2] => {
    "__network_connections_result.stderr_lines": [
        ""
    ]
}
13531 1726882457.89226: no more pending results, returning what we have 13531 1726882457.89230: results queue empty 13531 1726882457.89232: checking for any_errors_fatal 13531 1726882457.89244: done checking for any_errors_fatal 13531 1726882457.89245: checking for max_fail_percentage 13531 1726882457.89247: done checking for max_fail_percentage 13531 1726882457.89248: checking to see if all hosts have failed and the running result is not ok 13531 1726882457.89249: done checking to see if all hosts have failed 13531 1726882457.89250: getting the remaining hosts for this loop 13531 1726882457.89251: done getting the remaining hosts for this loop 13531 1726882457.89257: getting the next task for host managed_node2 13531 1726882457.89266: done getting next task for host managed_node2 13531 1726882457.89271: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 13531 1726882457.89274: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 13531 1726882457.89286: getting variables 13531 1726882457.89288: in VariableManager get_vars() 13531 1726882457.89347: Calling all_inventory to load vars for managed_node2 13531 1726882457.89350: Calling groups_inventory to load vars for managed_node2 13531 1726882457.89352: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882457.89368: Calling all_plugins_play to load vars for managed_node2 13531 1726882457.89371: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882457.89375: Calling groups_plugins_play to load vars for managed_node2 13531 1726882457.91073: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000012d 13531 1726882457.91077: WORKER PROCESS EXITING 13531 1726882457.92168: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882457.93986: done with get_vars() 13531 1726882457.94014: done getting variables 13531 1726882457.94082: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181
Friday 20 September 2024  21:34:17 -0400 (0:00:00.095)       0:00:45.836 ******
13531 1726882457.94118: entering _queue_task() for managed_node2/debug 13531 1726882457.94729: worker is 1 (out of 1 available) 13531 1726882457.94741: exiting _queue_task() for managed_node2/debug 13531 1726882457.94753: done queuing things up, now waiting for results queue to drain 13531 1726882457.94755: waiting for pending results...
13531 1726882457.95078: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 13531 1726882457.95244: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000012e 13531 1726882457.95266: variable 'ansible_search_path' from source: unknown 13531 1726882457.95275: variable 'ansible_search_path' from source: unknown 13531 1726882457.95324: calling self._execute() 13531 1726882457.95428: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882457.95439: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882457.95452: variable 'omit' from source: magic vars 13531 1726882457.95838: variable 'ansible_distribution_major_version' from source: facts 13531 1726882457.95864: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882457.95876: variable 'omit' from source: magic vars 13531 1726882457.95933: variable 'omit' from source: magic vars 13531 1726882457.95979: variable 'omit' from source: magic vars 13531 1726882457.96028: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882457.96076: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882457.96105: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882457.96127: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882457.96143: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882457.96184: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882457.96195: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882457.96202: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 13531 1726882457.96315: Set connection var ansible_pipelining to False 13531 1726882457.96327: Set connection var ansible_timeout to 10 13531 1726882457.96336: Set connection var ansible_shell_executable to /bin/sh 13531 1726882457.96346: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882457.96352: Set connection var ansible_connection to ssh 13531 1726882457.96359: Set connection var ansible_shell_type to sh 13531 1726882457.96393: variable 'ansible_shell_executable' from source: unknown 13531 1726882457.96406: variable 'ansible_connection' from source: unknown 13531 1726882457.96414: variable 'ansible_module_compression' from source: unknown 13531 1726882457.96420: variable 'ansible_shell_type' from source: unknown 13531 1726882457.96426: variable 'ansible_shell_executable' from source: unknown 13531 1726882457.96431: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882457.96438: variable 'ansible_pipelining' from source: unknown 13531 1726882457.96443: variable 'ansible_timeout' from source: unknown 13531 1726882457.96450: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882457.96594: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882457.96615: variable 'omit' from source: magic vars 13531 1726882457.96633: starting attempt loop 13531 1726882457.96640: running the handler 13531 1726882457.96694: variable '__network_connections_result' from source: set_fact 13531 1726882457.96770: variable '__network_connections_result' from source: set_fact 13531 1726882457.96884: handler run complete 13531 1726882457.96912: attempt loop complete, returning result 13531 1726882457.96918: 
_execute() done 13531 1726882457.96924: dumping result to json 13531 1726882457.96930: done dumping result, returning 13531 1726882457.96941: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-4fd9-519d-00000000012e] 13531 1726882457.96959: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000012e
ok: [managed_node2] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "name": "bond0.0",
                        "persistent_state": "absent",
                        "state": "down"
                    },
                    {
                        "name": "bond0.1",
                        "persistent_state": "absent",
                        "state": "down"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "\n",
        "stderr_lines": [
            ""
        ]
    }
}
13531 1726882457.97171: no more pending results, returning what we have 13531 1726882457.97175: results queue empty 13531 1726882457.97176: checking for any_errors_fatal 13531 1726882457.97184: done checking for any_errors_fatal 13531 1726882457.97185: checking for max_fail_percentage 13531 1726882457.97188: done checking for max_fail_percentage 13531 1726882457.97189: checking to see if all hosts have failed and the running result is not ok 13531 1726882457.97190: done checking to see if all hosts have failed 13531 1726882457.97190: getting the remaining hosts for this loop 13531 1726882457.97192: done getting the remaining hosts for this loop 13531 1726882457.97196: getting the next task for host managed_node2 13531 1726882457.97202: done getting next task for host managed_node2 13531 1726882457.97207: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 13531 1726882457.97211: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True,
pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882457.97222: getting variables 13531 1726882457.97223: in VariableManager get_vars() 13531 1726882457.97311: Calling all_inventory to load vars for managed_node2 13531 1726882457.97315: Calling groups_inventory to load vars for managed_node2 13531 1726882457.97317: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882457.97328: Calling all_plugins_play to load vars for managed_node2 13531 1726882457.97332: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882457.97335: Calling groups_plugins_play to load vars for managed_node2 13531 1726882457.98390: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000012e 13531 1726882457.98394: WORKER PROCESS EXITING 13531 1726882457.99381: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882458.02806: done with get_vars() 13531 1726882458.02844: done getting variables 13531 1726882458.02911: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186
Friday 20 September 2024  21:34:18 -0400 (0:00:00.088)       0:00:45.924 ******
13531
1726882458.02950: entering _queue_task() for managed_node2/debug 13531 1726882458.03374: worker is 1 (out of 1 available) 13531 1726882458.03386: exiting _queue_task() for managed_node2/debug 13531 1726882458.03429: done queuing things up, now waiting for results queue to drain 13531 1726882458.03431: waiting for pending results... 13531 1726882458.03741: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 13531 1726882458.03896: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000012f 13531 1726882458.03915: variable 'ansible_search_path' from source: unknown 13531 1726882458.03923: variable 'ansible_search_path' from source: unknown 13531 1726882458.03972: calling self._execute() 13531 1726882458.04080: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882458.04096: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882458.04109: variable 'omit' from source: magic vars 13531 1726882458.04512: variable 'ansible_distribution_major_version' from source: facts 13531 1726882458.04534: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882458.04666: variable 'network_state' from source: role '' defaults 13531 1726882458.04681: Evaluated conditional (network_state != {}): False 13531 1726882458.04688: when evaluation is False, skipping this task 13531 1726882458.04694: _execute() done 13531 1726882458.04701: dumping result to json 13531 1726882458.04711: done dumping result, returning 13531 1726882458.04722: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-4fd9-519d-00000000012f] 13531 1726882458.04732: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000012f 13531 1726882458.04843: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000012f
skipping: [managed_node2] => {
    "false_condition": "network_state != {}"
}
13531 1726882458.04896: no more pending results, returning what we have 13531 1726882458.04901: results queue empty 13531 1726882458.04902: checking for any_errors_fatal 13531 1726882458.04912: done checking for any_errors_fatal 13531 1726882458.04913: checking for max_fail_percentage 13531 1726882458.04915: done checking for max_fail_percentage 13531 1726882458.04917: checking to see if all hosts have failed and the running result is not ok 13531 1726882458.04918: done checking to see if all hosts have failed 13531 1726882458.04918: getting the remaining hosts for this loop 13531 1726882458.04920: done getting the remaining hosts for this loop 13531 1726882458.04924: getting the next task for host managed_node2 13531 1726882458.04931: done getting next task for host managed_node2 13531 1726882458.04936: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 13531 1726882458.04943: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 13531 1726882458.04967: getting variables 13531 1726882458.04969: in VariableManager get_vars() 13531 1726882458.05027: Calling all_inventory to load vars for managed_node2 13531 1726882458.05030: Calling groups_inventory to load vars for managed_node2 13531 1726882458.05033: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882458.05045: Calling all_plugins_play to load vars for managed_node2 13531 1726882458.05048: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882458.05051: Calling groups_plugins_play to load vars for managed_node2 13531 1726882458.06004: WORKER PROCESS EXITING 13531 1726882458.06981: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882458.08721: done with get_vars() 13531 1726882458.08760: done getting variables
TASK [fedora.linux_system_roles.network : Re-test connectivity] ****************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Friday 20 September 2024  21:34:18 -0400 (0:00:00.059)       0:00:45.983 ******
13531 1726882458.08861: entering _queue_task() for managed_node2/ping 13531 1726882458.09204: worker is 1 (out of 1 available) 13531 1726882458.09217: exiting _queue_task() for managed_node2/ping 13531 1726882458.09231: done queuing things up, now waiting for results queue to drain 13531 1726882458.09232: waiting for pending results...
13531 1726882458.09551: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 13531 1726882458.09710: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000130 13531 1726882458.09735: variable 'ansible_search_path' from source: unknown 13531 1726882458.09743: variable 'ansible_search_path' from source: unknown 13531 1726882458.09791: calling self._execute() 13531 1726882458.09903: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882458.09913: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882458.09925: variable 'omit' from source: magic vars 13531 1726882458.10330: variable 'ansible_distribution_major_version' from source: facts 13531 1726882458.10349: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882458.10359: variable 'omit' from source: magic vars 13531 1726882458.10422: variable 'omit' from source: magic vars 13531 1726882458.10467: variable 'omit' from source: magic vars 13531 1726882458.10521: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882458.10566: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882458.10596: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882458.10618: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882458.10633: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882458.10675: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882458.10684: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882458.10693: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 13531 1726882458.10799: Set connection var ansible_pipelining to False 13531 1726882458.10816: Set connection var ansible_timeout to 10 13531 1726882458.10826: Set connection var ansible_shell_executable to /bin/sh 13531 1726882458.10835: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882458.10841: Set connection var ansible_connection to ssh 13531 1726882458.10847: Set connection var ansible_shell_type to sh 13531 1726882458.10883: variable 'ansible_shell_executable' from source: unknown 13531 1726882458.10890: variable 'ansible_connection' from source: unknown 13531 1726882458.10897: variable 'ansible_module_compression' from source: unknown 13531 1726882458.10902: variable 'ansible_shell_type' from source: unknown 13531 1726882458.10907: variable 'ansible_shell_executable' from source: unknown 13531 1726882458.10917: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882458.10926: variable 'ansible_pipelining' from source: unknown 13531 1726882458.10931: variable 'ansible_timeout' from source: unknown 13531 1726882458.10937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882458.11141: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13531 1726882458.11159: variable 'omit' from source: magic vars 13531 1726882458.11171: starting attempt loop 13531 1726882458.11177: running the handler 13531 1726882458.11200: _low_level_execute_command(): starting 13531 1726882458.11211: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13531 1726882458.12002: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882458.12023: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 
1726882458.12039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882458.12068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882458.12115: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882458.12130: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882458.12145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882458.12166: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882458.12181: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882458.12191: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882458.12202: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882458.12213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882458.12231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882458.12243: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882458.12253: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882458.12271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882458.12352: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882458.12378: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882458.12399: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882458.12536: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 
1726882458.14199: stdout chunk (state=3): >>>/root <<< 13531 1726882458.14401: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882458.14405: stdout chunk (state=3): >>><<< 13531 1726882458.14407: stderr chunk (state=3): >>><<< 13531 1726882458.14534: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882458.14538: _low_level_execute_command(): starting 13531 1726882458.14542: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882458.1442947-15544-256772003026131 `" && echo ansible-tmp-1726882458.1442947-15544-256772003026131="` echo /root/.ansible/tmp/ansible-tmp-1726882458.1442947-15544-256772003026131 `" ) && sleep 0' 13531 1726882458.15118: stderr chunk (state=2): >>>OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882458.15132: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882458.15146: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882458.15167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882458.15217: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882458.15230: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882458.15244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882458.15260: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882458.15274: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882458.15289: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882458.15304: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882458.15317: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882458.15332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882458.15343: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882458.15354: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882458.15369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882458.15447: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882458.15471: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882458.15488: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 13531 1726882458.15622: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882458.17520: stdout chunk (state=3): >>>ansible-tmp-1726882458.1442947-15544-256772003026131=/root/.ansible/tmp/ansible-tmp-1726882458.1442947-15544-256772003026131 <<< 13531 1726882458.17722: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882458.17726: stdout chunk (state=3): >>><<< 13531 1726882458.17728: stderr chunk (state=3): >>><<< 13531 1726882458.18058: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882458.1442947-15544-256772003026131=/root/.ansible/tmp/ansible-tmp-1726882458.1442947-15544-256772003026131 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882458.18061: variable 'ansible_module_compression' from source: unknown 13531 1726882458.18066: ANSIBALLZ: using cached 
module: /root/.ansible/tmp/ansible-local-13531i_snf1g8/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 13531 1726882458.18068: variable 'ansible_facts' from source: unknown 13531 1726882458.18070: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882458.1442947-15544-256772003026131/AnsiballZ_ping.py 13531 1726882458.18432: Sending initial data 13531 1726882458.18435: Sent initial data (153 bytes) 13531 1726882458.20713: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882458.20717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882458.20737: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882458.20741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882458.20843: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882458.20846: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882458.20929: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882458.22726: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports 
extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13531 1726882458.22822: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13531 1726882458.22923: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13531i_snf1g8/tmp1zsoabjx /root/.ansible/tmp/ansible-tmp-1726882458.1442947-15544-256772003026131/AnsiballZ_ping.py <<< 13531 1726882458.23016: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13531 1726882458.24543: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882458.24547: stderr chunk (state=3): >>><<< 13531 1726882458.24550: stdout chunk (state=3): >>><<< 13531 1726882458.24579: done transferring module to remote 13531 1726882458.24589: _low_level_execute_command(): starting 13531 1726882458.24594: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882458.1442947-15544-256772003026131/ /root/.ansible/tmp/ansible-tmp-1726882458.1442947-15544-256772003026131/AnsiballZ_ping.py && sleep 0' 13531 1726882458.25479: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882458.25482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882458.25527: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882458.25539: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882458.25541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882458.25593: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882458.25600: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882458.25610: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882458.25720: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882458.27551: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882458.27555: stderr chunk (state=3): >>><<< 13531 1726882458.27565: stdout chunk (state=3): >>><<< 13531 1726882458.27585: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882458.27588: _low_level_execute_command(): starting 13531 1726882458.27591: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882458.1442947-15544-256772003026131/AnsiballZ_ping.py && sleep 0' 13531 1726882458.28262: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882458.28274: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882458.28312: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882458.28315: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882458.28318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882458.28382: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882458.28388: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882458.28391: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882458.28495: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882458.41524: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 13531 1726882458.42512: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 13531 1726882458.42571: stderr chunk (state=3): >>><<< 13531 1726882458.42575: stdout chunk (state=3): >>><<< 13531 1726882458.42590: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 13531 1726882458.42612: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882458.1442947-15544-256772003026131/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13531 1726882458.42620: _low_level_execute_command(): starting 13531 1726882458.42624: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882458.1442947-15544-256772003026131/ > /dev/null 2>&1 && sleep 0' 13531 1726882458.43081: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882458.43088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882458.43137: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 13531 1726882458.43141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882458.43144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882458.43205: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882458.43209: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882458.43211: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882458.43317: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882458.45127: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882458.45193: stderr chunk (state=3): >>><<< 13531 1726882458.45197: stdout chunk (state=3): >>><<< 13531 1726882458.45212: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882458.45218: handler run complete 13531 1726882458.45230: attempt loop complete, returning result 13531 1726882458.45233: _execute() done 13531 1726882458.45235: dumping result to json 13531 1726882458.45237: done dumping result, returning 13531 1726882458.45246: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-4fd9-519d-000000000130] 13531 1726882458.45258: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000130 13531 1726882458.45342: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000130 13531 1726882458.45345: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 13531 1726882458.45427: no more pending results, returning what we have 13531 1726882458.45430: results queue empty 13531 1726882458.45431: checking for any_errors_fatal 13531 1726882458.45438: done checking for any_errors_fatal 13531 1726882458.45438: checking for max_fail_percentage 13531 1726882458.45440: done checking for max_fail_percentage 13531 1726882458.45441: checking to see if all hosts have failed and the running result is not ok 13531 1726882458.45442: done checking to see if all hosts have failed 13531 1726882458.45443: getting the remaining hosts for this loop 13531 1726882458.45445: done getting the remaining hosts for this loop 13531 1726882458.45448: getting the next task for host managed_node2 13531 1726882458.45457: done getting next task for host managed_node2 13531 1726882458.45460: ^ task is: TASK: meta (role_complete) 13531 1726882458.45465: ^ state is: HOST STATE: block=2, task=30, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882458.45476: getting variables 13531 1726882458.45478: in VariableManager get_vars() 13531 1726882458.45530: Calling all_inventory to load vars for managed_node2 13531 1726882458.45533: Calling groups_inventory to load vars for managed_node2 13531 1726882458.45535: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882458.45545: Calling all_plugins_play to load vars for managed_node2 13531 1726882458.45547: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882458.45550: Calling groups_plugins_play to load vars for managed_node2 13531 1726882458.46827: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882458.47993: done with get_vars() 13531 1726882458.48012: done getting variables 13531 1726882458.48072: done queuing things up, now waiting for results queue to drain 13531 1726882458.48074: results queue empty 13531 1726882458.48075: checking for any_errors_fatal 13531 1726882458.48076: done checking for any_errors_fatal 13531 1726882458.48077: checking for max_fail_percentage 13531 1726882458.48078: done checking for max_fail_percentage 13531 1726882458.48078: checking to see if all hosts have failed and the running result is not ok 13531 1726882458.48079: done checking to see if all hosts have failed 13531 1726882458.48079: getting the remaining hosts for this loop 13531 1726882458.48080: done getting the remaining hosts for this loop 13531 1726882458.48082: getting the next task for host managed_node2 13531 1726882458.48084: done getting next task for host 
managed_node2 13531 1726882458.48086: ^ task is: TASK: From the active connection, get the controller profile "{{ controller_profile }}" 13531 1726882458.48087: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882458.48089: getting variables 13531 1726882458.48089: in VariableManager get_vars() 13531 1726882458.48103: Calling all_inventory to load vars for managed_node2 13531 1726882458.48104: Calling groups_inventory to load vars for managed_node2 13531 1726882458.48106: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882458.48110: Calling all_plugins_play to load vars for managed_node2 13531 1726882458.48112: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882458.48113: Calling groups_plugins_play to load vars for managed_node2 13531 1726882458.48964: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882458.50143: done with get_vars() 13531 1726882458.50166: done getting variables 13531 1726882458.50205: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13531 1726882458.50317: variable 'controller_profile' from source: play vars TASK [From the active connection, get the controller profile "bond0"] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:200 Friday 20 September 2024 21:34:18 -0400 (0:00:00.414) 0:00:46.398 ****** 13531 1726882458.50343: entering 
_queue_task() for managed_node2/command 13531 1726882458.50606: worker is 1 (out of 1 available) 13531 1726882458.50620: exiting _queue_task() for managed_node2/command 13531 1726882458.50633: done queuing things up, now waiting for results queue to drain 13531 1726882458.50634: waiting for pending results... 13531 1726882458.50818: running TaskExecutor() for managed_node2/TASK: From the active connection, get the controller profile "bond0" 13531 1726882458.50886: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000160 13531 1726882458.50898: variable 'ansible_search_path' from source: unknown 13531 1726882458.50930: calling self._execute() 13531 1726882458.51004: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882458.51008: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882458.51016: variable 'omit' from source: magic vars 13531 1726882458.51285: variable 'ansible_distribution_major_version' from source: facts 13531 1726882458.51297: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882458.51374: variable 'network_provider' from source: set_fact 13531 1726882458.51379: Evaluated conditional (network_provider == "nm"): True 13531 1726882458.51385: variable 'omit' from source: magic vars 13531 1726882458.51403: variable 'omit' from source: magic vars 13531 1726882458.51469: variable 'controller_profile' from source: play vars 13531 1726882458.51485: variable 'omit' from source: magic vars 13531 1726882458.51521: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882458.51547: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882458.51565: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882458.51578: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 13531 1726882458.51588: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882458.51611: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882458.51614: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882458.51616: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882458.51688: Set connection var ansible_pipelining to False 13531 1726882458.51693: Set connection var ansible_timeout to 10 13531 1726882458.51698: Set connection var ansible_shell_executable to /bin/sh 13531 1726882458.51703: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882458.51705: Set connection var ansible_connection to ssh 13531 1726882458.51707: Set connection var ansible_shell_type to sh 13531 1726882458.51729: variable 'ansible_shell_executable' from source: unknown 13531 1726882458.51732: variable 'ansible_connection' from source: unknown 13531 1726882458.51734: variable 'ansible_module_compression' from source: unknown 13531 1726882458.51736: variable 'ansible_shell_type' from source: unknown 13531 1726882458.51738: variable 'ansible_shell_executable' from source: unknown 13531 1726882458.51740: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882458.51742: variable 'ansible_pipelining' from source: unknown 13531 1726882458.51745: variable 'ansible_timeout' from source: unknown 13531 1726882458.51750: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882458.51847: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 
1726882458.51860: variable 'omit' from source: magic vars 13531 1726882458.51863: starting attempt loop 13531 1726882458.51866: running the handler 13531 1726882458.51880: _low_level_execute_command(): starting 13531 1726882458.51886: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13531 1726882458.52427: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882458.52452: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882458.52474: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882458.52520: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882458.52524: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882458.52533: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882458.52652: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882458.54306: stdout chunk (state=3): >>>/root <<< 13531 1726882458.54404: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882458.54460: stderr chunk (state=3): >>><<< 
13531 1726882458.54465: stdout chunk (state=3): >>><<< 13531 1726882458.54487: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882458.54502: _low_level_execute_command(): starting 13531 1726882458.54508: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882458.5448756-15584-10302557420161 `" && echo ansible-tmp-1726882458.5448756-15584-10302557420161="` echo /root/.ansible/tmp/ansible-tmp-1726882458.5448756-15584-10302557420161 `" ) && sleep 0' 13531 1726882458.54975: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882458.54981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882458.55013: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882458.55037: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882458.55095: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882458.55106: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882458.55212: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882458.57094: stdout chunk (state=3): >>>ansible-tmp-1726882458.5448756-15584-10302557420161=/root/.ansible/tmp/ansible-tmp-1726882458.5448756-15584-10302557420161 <<< 13531 1726882458.57201: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882458.57278: stderr chunk (state=3): >>><<< 13531 1726882458.57289: stdout chunk (state=3): >>><<< 13531 1726882458.57318: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882458.5448756-15584-10302557420161=/root/.ansible/tmp/ansible-tmp-1726882458.5448756-15584-10302557420161 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882458.57350: variable 'ansible_module_compression' from source: unknown 13531 1726882458.57409: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13531i_snf1g8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13531 1726882458.57451: variable 'ansible_facts' from source: unknown 13531 1726882458.57514: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882458.5448756-15584-10302557420161/AnsiballZ_command.py 13531 1726882458.57619: Sending initial data 13531 1726882458.57623: Sent initial data (155 bytes) 13531 1726882458.58289: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882458.58294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882458.58326: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882458.58337: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882458.58391: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882458.58403: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882458.58509: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882458.60250: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13531 1726882458.60343: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13531 1726882458.60441: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13531i_snf1g8/tmpp_8proki 
/root/.ansible/tmp/ansible-tmp-1726882458.5448756-15584-10302557420161/AnsiballZ_command.py <<< 13531 1726882458.60533: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13531 1726882458.61551: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882458.61666: stderr chunk (state=3): >>><<< 13531 1726882458.61670: stdout chunk (state=3): >>><<< 13531 1726882458.61688: done transferring module to remote 13531 1726882458.61697: _low_level_execute_command(): starting 13531 1726882458.61701: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882458.5448756-15584-10302557420161/ /root/.ansible/tmp/ansible-tmp-1726882458.5448756-15584-10302557420161/AnsiballZ_command.py && sleep 0' 13531 1726882458.62148: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882458.62165: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882458.62189: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882458.62200: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882458.62246: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master <<< 13531 1726882458.62258: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882458.62374: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882458.64138: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882458.64186: stderr chunk (state=3): >>><<< 13531 1726882458.64196: stdout chunk (state=3): >>><<< 13531 1726882458.64221: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882458.64225: _low_level_execute_command(): starting 13531 1726882458.64227: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882458.5448756-15584-10302557420161/AnsiballZ_command.py && sleep 0' 13531 
1726882458.64670: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882458.64676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882458.64705: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882458.64716: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882458.64771: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882458.64783: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882458.64896: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882458.80351: stdout chunk (state=3): >>> {"changed": true, "stdout": "connection.id: bond0\nconnection.uuid: 467176f0-8c25-4dd1-9498-f31f30164a10\nconnection.stable-id: --\nconnection.type: bond\nconnection.interface-name: nm-bond\nconnection.autoconnect: yes\nconnection.autoconnect-priority: 0\nconnection.autoconnect-retries: -1 (default)\nconnection.multi-connect: 0 (default)\nconnection.auth-retries: -1\nconnection.timestamp: 1726882450\nconnection.permissions: --\nconnection.zone: 
--\nconnection.controller: --\nconnection.master: --\nconnection.slave-type: --\nconnection.port-type: --\nconnection.autoconnect-slaves: -1 (default)\nconnection.autoconnect-ports: -1 (default)\nconnection.down-on-poweroff: -1 (default)\nconnection.secondaries: --\nconnection.gateway-ping-timeout: 0\nconnection.metered: unknown\nconnection.lldp: default\nconnection.mdns: -1 (default)\nconnection.llmnr: -1 (default)\nconnection.dns-over-tls: -1 (default)\nconnection.mptcp-flags: 0x0 (default)\nconnection.wait-device-timeout: -1\nconnection.wait-activation-delay: -1\nipv4.method: auto\nipv4.dns: --\nipv4.dns-search: --\nipv4.dns-options: --\nipv4.dns-priority: 0\nipv4.addresses: --\nipv4.gateway: --\nipv4.routes: --\nipv4.route-metric: 65535\nipv4.route-table: 0 (unspec)\nipv4.routing-rules: --\nipv4.replace-local-rule: -1 (default<<< 13531 1726882458.80390: stdout chunk (state=3): >>>)\nipv4.dhcp-send-release: -1 (default)\nipv4.ignore-auto-routes: no\nipv4.ignore-auto-dns: no\nipv4.dhcp-client-id: --\nipv4.dhcp-iaid: --\nipv4.dhcp-dscp: --\nipv4.dhcp-timeout: 0 (default)\nipv4.dhcp-send-hostname: yes\nipv4.dhcp-hostname: --\nipv4.dhcp-fqdn: --\nipv4.dhcp-hostname-flags: 0x0 (none)\nipv4.never-default: no\nipv4.may-fail: yes\nipv4.required-timeout: -1 (default)\nipv4.dad-timeout: -1 (default)\nipv4.dhcp-vendor-class-identifier: --\nipv4.link-local: 0 (default)\nipv4.dhcp-reject-servers: --\nipv4.auto-route-ext-gw: -1 (default)\nipv6.method: auto\nipv6.dns: --\nipv6.dns-search: --\nipv6.dns-options: --\nipv6.dns-priority: 0\nipv6.addresses: --\nipv6.gateway: --\nipv6.routes: --\nipv6.route-metric: -1\nipv6.route-table: 0 (unspec)\nipv6.routing-rules: --\nipv6.replace-local-rule: -1 (default)\nipv6.dhcp-send-release: -1 (default)\nipv6.ignore-auto-routes: no\nipv6.ignore-auto-dns: no\nipv6.never-default: no\nipv6.may-fail: yes\nipv6.required-timeout: -1 (default)\nipv6.ip6-privacy: -1 (default)\nipv6.temp-valid-lifetime: 0 (default)\nipv6.temp-preferred-lifetime: 0 
(default)\nipv6.addr-gen-mode: default\nipv6.ra-timeout: 0 (default)\nipv6.mtu: auto\nipv6.dhcp-pd-hint: --\nipv6.dhcp-duid: --\nipv6.dhcp-iaid: --\nipv6.dhcp-timeout: 0 (default)\nipv6.dhcp-send-hostname: yes\nipv6.dhcp-hostname: --\nipv6.dhcp-hostname-flags: 0x0 (none)\nipv6.auto-route-ext-gw: -1 (default)\nipv6.token: --\nbond.options: mode=active-backup,miimon=110\nproxy.method: none\nproxy.browser-only: no\nproxy.pac-url: --\nproxy.pac-script: --\nGENERAL.NAME: bond0\nGENERAL.UUID: 467176f0-8c25-4dd1-9498-f31f30164a10\nGENERAL.DEVICES: nm-bond\nGENERAL.IP-IFACE: nm-bond\nGENERAL.STATE: activated\nGENERAL.DEFAULT: no\nGENERAL.DEFAULT6: yes\nGENERAL.SPEC-OBJECT: --\nGENERAL.VPN: no\nGENERAL.DBUS-PATH: /org/freedesktop/NetworkManager/ActiveConnection/24\nGENERAL.CON-PATH: /org/freedesktop/NetworkManager/Settings/18\nGENERAL.ZONE: --\nGENERAL.MASTER-PATH: --\nIP4.ADDRESS[1]: 192.0.2.223/24\nIP4.GATEWAY: 192.0.2.1\nIP4.ROUTE[1]: dst = 192.0.2.0/24, nh = 0.0.0.0, mt = 65535\nIP4.ROUTE[2]: dst = 0.0.0.0/0, nh = 192.0.2.1, mt = 65535\nIP4.DNS[1]: 192.0.2.1\nDHCP4.OPTION[1]: broadcast_address = 192.0.2.255\nDHCP4.OPTION[2]: dhcp_client_identifier = 01:46:d8:6a:e1:d6:ed\nDHCP4.OPTION[3]: dhcp_lease_time = 240\nDHCP4.OPTION[4]: dhcp_server_identifier = 192.0.2.1\nDHCP4.OPTION[5]: domain_name_servers = 192.0.2.1\nDHCP4.OPTION[6]: expiry = 1726882690\nDHCP4.OPTION[7]: host_name = ip-10-31-11-158\nDHCP4.OPTION[8]: ip_address = 192.0.2.223\nDHCP4.OPTION[9]: next_server = 192.0.2.1\nDHCP4.OPTION[10]: requested_broadcast_address = 1\nDHCP4.OPTION[11]: requested_domain_name = 1\nDHCP4.OPTION[12]: requested_domain_name_servers = 1\nDHCP4.OPTION[13]: requested_domain_search = 1\nDHCP4.OPTION[14]: requested_host_name = 1\nDHCP4.OPTION[15]: requested_interface_mtu = 1\nDHCP4.OPTION[16]: requested_ms_classless_static_routes = 1\nDHCP4.OPTION[17]: requested_nis_domain = 1\nDHCP4.OPTION[18]: requested_nis_servers = 1\nDHCP4.OPTION[19]: requested_ntp_servers = 1\nDHCP4.OPTION[20]: 
requested_rfc3442_classless_static_routes = 1\nDHCP4.OPTION[21]: requested_root_path = 1\nDHCP4.OPTION[22]: requested_routers = 1\nDHCP4.OPTION[23]: requested_static_routes = 1\nDHCP4.OPTION[24]: requested_subnet_mask = 1\nDHCP4.OPTION[25]: requested_time_offset = 1\nDHCP4.OPTION[26]: requested_wpad = 1\nDHCP4.OPTION[27]: routers = 192.0.2.1\nDHCP4.OPTION[28]: subnet_mask = 255.255.255.0\nIP6.ADDRESS[1]: 2001:db8::1ab/128\nIP6.ADDRESS[2]: 2001:db8::be8c:fde3:4c17:d92b/64\nIP6.ADDRESS[3]: fe80::c76c:d44b:b7e7:a551/64\nIP6.GATEWAY: fe80::7453:13ff:fed2:efa3\nIP6.ROUTE[1]: dst = 2001:db8::1ab/128, nh = ::, mt = 300\nIP6.ROUTE[2]: dst = fe80::/64, nh = ::, mt = 1024\nIP6.ROUTE[3]: dst = 2001:db8::/64, nh = ::, mt = 300\nIP6.ROUTE[4]: dst = ::/0, nh = fe80::7453:13ff:fed2:efa3, mt = 300\nIP6.DNS[1]: 2001:db8::24d4:43ff:fed5:d8f1\nIP6.DNS[2]: fe80::7453:13ff:fed2:efa3\nDHCP6.OPTION[1]: dhcp6_client_id = 00:04:62:cc:ef:b5:d7:1a:b6:b9:cc:b2:8e:19:f9:82:3d:b4\nDHCP6.OPTION[2]: dhcp6_name_servers = 2001:db8::24d4:43ff:fed5:d8f1\nDHCP6.OPTION[3]: fqdn_fqdn = ip-10-31-11-158\nDHCP6.OPTION[4]: iaid = 8c:3b:13:c0\nDHCP6.OPTION[5]: ip6_address = 2001:db8::1ab", "stderr": "", "rc": 0, "cmd": ["nmcli", "c", "show", "--active", "bond0"], "start": "2024-09-20 21:34:18.779855", "end": "2024-09-20 21:34:18.801200", "delta": "0:00:00.021345", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli c show --active bond0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13531 1726882458.81708: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 13531 1726882458.81712: stdout chunk (state=3): >>><<< 13531 1726882458.81715: stderr chunk (state=3): >>><<< 13531 1726882458.81917: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "connection.id: bond0\nconnection.uuid: 467176f0-8c25-4dd1-9498-f31f30164a10\nconnection.stable-id: --\nconnection.type: bond\nconnection.interface-name: nm-bond\nconnection.autoconnect: yes\nconnection.autoconnect-priority: 0\nconnection.autoconnect-retries: -1 (default)\nconnection.multi-connect: 0 (default)\nconnection.auth-retries: -1\nconnection.timestamp: 1726882450\nconnection.permissions: --\nconnection.zone: --\nconnection.controller: --\nconnection.master: --\nconnection.slave-type: --\nconnection.port-type: --\nconnection.autoconnect-slaves: -1 (default)\nconnection.autoconnect-ports: -1 (default)\nconnection.down-on-poweroff: -1 (default)\nconnection.secondaries: --\nconnection.gateway-ping-timeout: 0\nconnection.metered: unknown\nconnection.lldp: default\nconnection.mdns: -1 (default)\nconnection.llmnr: -1 (default)\nconnection.dns-over-tls: -1 (default)\nconnection.mptcp-flags: 0x0 (default)\nconnection.wait-device-timeout: -1\nconnection.wait-activation-delay: -1\nipv4.method: auto\nipv4.dns: --\nipv4.dns-search: --\nipv4.dns-options: --\nipv4.dns-priority: 0\nipv4.addresses: --\nipv4.gateway: --\nipv4.routes: --\nipv4.route-metric: 65535\nipv4.route-table: 0 (unspec)\nipv4.routing-rules: --\nipv4.replace-local-rule: -1 (default)\nipv4.dhcp-send-release: -1 (default)\nipv4.ignore-auto-routes: no\nipv4.ignore-auto-dns: no\nipv4.dhcp-client-id: --\nipv4.dhcp-iaid: --\nipv4.dhcp-dscp: --\nipv4.dhcp-timeout: 0 (default)\nipv4.dhcp-send-hostname: yes\nipv4.dhcp-hostname: --\nipv4.dhcp-fqdn: --\nipv4.dhcp-hostname-flags: 0x0 (none)\nipv4.never-default: no\nipv4.may-fail: yes\nipv4.required-timeout: -1 (default)\nipv4.dad-timeout: -1 (default)\nipv4.dhcp-vendor-class-identifier: --\nipv4.link-local: 0 (default)\nipv4.dhcp-reject-servers: 
--\nipv4.auto-route-ext-gw: -1 (default)\nipv6.method: auto\nipv6.dns: --\nipv6.dns-search: --\nipv6.dns-options: --\nipv6.dns-priority: 0\nipv6.addresses: --\nipv6.gateway: --\nipv6.routes: --\nipv6.route-metric: -1\nipv6.route-table: 0 (unspec)\nipv6.routing-rules: --\nipv6.replace-local-rule: -1 (default)\nipv6.dhcp-send-release: -1 (default)\nipv6.ignore-auto-routes: no\nipv6.ignore-auto-dns: no\nipv6.never-default: no\nipv6.may-fail: yes\nipv6.required-timeout: -1 (default)\nipv6.ip6-privacy: -1 (default)\nipv6.temp-valid-lifetime: 0 (default)\nipv6.temp-preferred-lifetime: 0 (default)\nipv6.addr-gen-mode: default\nipv6.ra-timeout: 0 (default)\nipv6.mtu: auto\nipv6.dhcp-pd-hint: --\nipv6.dhcp-duid: --\nipv6.dhcp-iaid: --\nipv6.dhcp-timeout: 0 (default)\nipv6.dhcp-send-hostname: yes\nipv6.dhcp-hostname: --\nipv6.dhcp-hostname-flags: 0x0 (none)\nipv6.auto-route-ext-gw: -1 (default)\nipv6.token: --\nbond.options: mode=active-backup,miimon=110\nproxy.method: none\nproxy.browser-only: no\nproxy.pac-url: --\nproxy.pac-script: --\nGENERAL.NAME: bond0\nGENERAL.UUID: 467176f0-8c25-4dd1-9498-f31f30164a10\nGENERAL.DEVICES: nm-bond\nGENERAL.IP-IFACE: nm-bond\nGENERAL.STATE: activated\nGENERAL.DEFAULT: no\nGENERAL.DEFAULT6: yes\nGENERAL.SPEC-OBJECT: --\nGENERAL.VPN: no\nGENERAL.DBUS-PATH: /org/freedesktop/NetworkManager/ActiveConnection/24\nGENERAL.CON-PATH: /org/freedesktop/NetworkManager/Settings/18\nGENERAL.ZONE: --\nGENERAL.MASTER-PATH: --\nIP4.ADDRESS[1]: 192.0.2.223/24\nIP4.GATEWAY: 192.0.2.1\nIP4.ROUTE[1]: dst = 192.0.2.0/24, nh = 0.0.0.0, mt = 65535\nIP4.ROUTE[2]: dst = 0.0.0.0/0, nh = 192.0.2.1, mt = 65535\nIP4.DNS[1]: 192.0.2.1\nDHCP4.OPTION[1]: broadcast_address = 192.0.2.255\nDHCP4.OPTION[2]: dhcp_client_identifier = 01:46:d8:6a:e1:d6:ed\nDHCP4.OPTION[3]: dhcp_lease_time = 240\nDHCP4.OPTION[4]: dhcp_server_identifier = 192.0.2.1\nDHCP4.OPTION[5]: domain_name_servers = 192.0.2.1\nDHCP4.OPTION[6]: expiry = 1726882690\nDHCP4.OPTION[7]: host_name = 
ip-10-31-11-158\nDHCP4.OPTION[8]: ip_address = 192.0.2.223\nDHCP4.OPTION[9]: next_server = 192.0.2.1\nDHCP4.OPTION[10]: requested_broadcast_address = 1\nDHCP4.OPTION[11]: requested_domain_name = 1\nDHCP4.OPTION[12]: requested_domain_name_servers = 1\nDHCP4.OPTION[13]: requested_domain_search = 1\nDHCP4.OPTION[14]: requested_host_name = 1\nDHCP4.OPTION[15]: requested_interface_mtu = 1\nDHCP4.OPTION[16]: requested_ms_classless_static_routes = 1\nDHCP4.OPTION[17]: requested_nis_domain = 1\nDHCP4.OPTION[18]: requested_nis_servers = 1\nDHCP4.OPTION[19]: requested_ntp_servers = 1\nDHCP4.OPTION[20]: requested_rfc3442_classless_static_routes = 1\nDHCP4.OPTION[21]: requested_root_path = 1\nDHCP4.OPTION[22]: requested_routers = 1\nDHCP4.OPTION[23]: requested_static_routes = 1\nDHCP4.OPTION[24]: requested_subnet_mask = 1\nDHCP4.OPTION[25]: requested_time_offset = 1\nDHCP4.OPTION[26]: requested_wpad = 1\nDHCP4.OPTION[27]: routers = 192.0.2.1\nDHCP4.OPTION[28]: subnet_mask = 255.255.255.0\nIP6.ADDRESS[1]: 2001:db8::1ab/128\nIP6.ADDRESS[2]: 2001:db8::be8c:fde3:4c17:d92b/64\nIP6.ADDRESS[3]: fe80::c76c:d44b:b7e7:a551/64\nIP6.GATEWAY: fe80::7453:13ff:fed2:efa3\nIP6.ROUTE[1]: dst = 2001:db8::1ab/128, nh = ::, mt = 300\nIP6.ROUTE[2]: dst = fe80::/64, nh = ::, mt = 1024\nIP6.ROUTE[3]: dst = 2001:db8::/64, nh = ::, mt = 300\nIP6.ROUTE[4]: dst = ::/0, nh = fe80::7453:13ff:fed2:efa3, mt = 300\nIP6.DNS[1]: 2001:db8::24d4:43ff:fed5:d8f1\nIP6.DNS[2]: fe80::7453:13ff:fed2:efa3\nDHCP6.OPTION[1]: dhcp6_client_id = 00:04:62:cc:ef:b5:d7:1a:b6:b9:cc:b2:8e:19:f9:82:3d:b4\nDHCP6.OPTION[2]: dhcp6_name_servers = 2001:db8::24d4:43ff:fed5:d8f1\nDHCP6.OPTION[3]: fqdn_fqdn = ip-10-31-11-158\nDHCP6.OPTION[4]: iaid = 8c:3b:13:c0\nDHCP6.OPTION[5]: ip6_address = 2001:db8::1ab", "stderr": "", "rc": 0, "cmd": ["nmcli", "c", "show", "--active", "bond0"], "start": "2024-09-20 21:34:18.779855", "end": "2024-09-20 21:34:18.801200", "delta": "0:00:00.021345", "msg": "", "invocation": {"module_args": {"_raw_params": 
"nmcli c show --active bond0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
13531 1726882458.81928: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli c show --active bond0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882458.5448756-15584-10302557420161/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13531 1726882458.81931: _low_level_execute_command(): starting 13531 1726882458.81934: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882458.5448756-15584-10302557420161/ > /dev/null 2>&1 && sleep 0' 13531 1726882458.83388: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882458.83392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882458.83422: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882458.83427: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882458.83431: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 13531 1726882458.83434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882458.83620: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882458.83784: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882458.83787: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882458.83996: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882458.85743: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882458.85830: stderr chunk (state=3): >>><<< 13531 1726882458.85834: stdout chunk (state=3): >>><<< 13531 1726882458.86270: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882458.86274: handler run complete 13531 1726882458.86282: Evaluated conditional (False): False 13531 1726882458.86285: attempt loop complete, returning result 13531 1726882458.86287: _execute() done 13531 1726882458.86289: dumping result to json 13531 1726882458.86291: done dumping result, returning 13531 1726882458.86293: done running TaskExecutor() for managed_node2/TASK: From the active connection, get the controller profile "bond0" [0e448fcc-3ce9-4fd9-519d-000000000160] 13531 1726882458.86295: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000160 ok: [managed_node2] => { "changed": false, "cmd": [ "nmcli", "c", "show", "--active", "bond0" ], "delta": "0:00:00.021345", "end": "2024-09-20 21:34:18.801200", "rc": 0, "start": "2024-09-20 21:34:18.779855" } STDOUT: connection.id: bond0 connection.uuid: 467176f0-8c25-4dd1-9498-f31f30164a10 connection.stable-id: -- connection.type: bond connection.interface-name: nm-bond connection.autoconnect: yes connection.autoconnect-priority: 0 connection.autoconnect-retries: -1 (default) connection.multi-connect: 0 (default) connection.auth-retries: -1 connection.timestamp: 1726882450 connection.permissions: -- connection.zone: -- connection.controller: -- connection.master: -- connection.slave-type: -- connection.port-type: -- connection.autoconnect-slaves: -1 (default) connection.autoconnect-ports: -1 (default) connection.down-on-poweroff: -1 (default) connection.secondaries: -- connection.gateway-ping-timeout: 0 connection.metered: unknown connection.lldp: default connection.mdns: -1 (default) connection.llmnr: -1 (default) connection.dns-over-tls: -1 (default) connection.mptcp-flags: 0x0 (default) connection.wait-device-timeout: -1 connection.wait-activation-delay: -1 ipv4.method: auto ipv4.dns: -- ipv4.dns-search: -- ipv4.dns-options: -- ipv4.dns-priority: 0 
ipv4.addresses: -- ipv4.gateway: -- ipv4.routes: -- ipv4.route-metric: 65535 ipv4.route-table: 0 (unspec) ipv4.routing-rules: -- ipv4.replace-local-rule: -1 (default) ipv4.dhcp-send-release: -1 (default) ipv4.ignore-auto-routes: no ipv4.ignore-auto-dns: no ipv4.dhcp-client-id: -- ipv4.dhcp-iaid: -- ipv4.dhcp-dscp: -- ipv4.dhcp-timeout: 0 (default) ipv4.dhcp-send-hostname: yes ipv4.dhcp-hostname: -- ipv4.dhcp-fqdn: -- ipv4.dhcp-hostname-flags: 0x0 (none) ipv4.never-default: no ipv4.may-fail: yes ipv4.required-timeout: -1 (default) ipv4.dad-timeout: -1 (default) ipv4.dhcp-vendor-class-identifier: -- ipv4.link-local: 0 (default) ipv4.dhcp-reject-servers: -- ipv4.auto-route-ext-gw: -1 (default) ipv6.method: auto ipv6.dns: -- ipv6.dns-search: -- ipv6.dns-options: -- ipv6.dns-priority: 0 ipv6.addresses: -- ipv6.gateway: -- ipv6.routes: -- ipv6.route-metric: -1 ipv6.route-table: 0 (unspec) ipv6.routing-rules: -- ipv6.replace-local-rule: -1 (default) ipv6.dhcp-send-release: -1 (default) ipv6.ignore-auto-routes: no ipv6.ignore-auto-dns: no ipv6.never-default: no ipv6.may-fail: yes ipv6.required-timeout: -1 (default) ipv6.ip6-privacy: -1 (default) ipv6.temp-valid-lifetime: 0 (default) ipv6.temp-preferred-lifetime: 0 (default) ipv6.addr-gen-mode: default ipv6.ra-timeout: 0 (default) ipv6.mtu: auto ipv6.dhcp-pd-hint: -- ipv6.dhcp-duid: -- ipv6.dhcp-iaid: -- ipv6.dhcp-timeout: 0 (default) ipv6.dhcp-send-hostname: yes ipv6.dhcp-hostname: -- ipv6.dhcp-hostname-flags: 0x0 (none) ipv6.auto-route-ext-gw: -1 (default) ipv6.token: -- bond.options: mode=active-backup,miimon=110 proxy.method: none proxy.browser-only: no proxy.pac-url: -- proxy.pac-script: -- GENERAL.NAME: bond0 GENERAL.UUID: 467176f0-8c25-4dd1-9498-f31f30164a10 GENERAL.DEVICES: nm-bond GENERAL.IP-IFACE: nm-bond GENERAL.STATE: activated GENERAL.DEFAULT: no GENERAL.DEFAULT6: yes GENERAL.SPEC-OBJECT: -- GENERAL.VPN: no GENERAL.DBUS-PATH: /org/freedesktop/NetworkManager/ActiveConnection/24 GENERAL.CON-PATH: 
/org/freedesktop/NetworkManager/Settings/18 GENERAL.ZONE: -- GENERAL.MASTER-PATH: -- IP4.ADDRESS[1]: 192.0.2.223/24 IP4.GATEWAY: 192.0.2.1 IP4.ROUTE[1]: dst = 192.0.2.0/24, nh = 0.0.0.0, mt = 65535 IP4.ROUTE[2]: dst = 0.0.0.0/0, nh = 192.0.2.1, mt = 65535 IP4.DNS[1]: 192.0.2.1 DHCP4.OPTION[1]: broadcast_address = 192.0.2.255 DHCP4.OPTION[2]: dhcp_client_identifier = 01:46:d8:6a:e1:d6:ed DHCP4.OPTION[3]: dhcp_lease_time = 240 DHCP4.OPTION[4]: dhcp_server_identifier = 192.0.2.1 DHCP4.OPTION[5]: domain_name_servers = 192.0.2.1 DHCP4.OPTION[6]: expiry = 1726882690 DHCP4.OPTION[7]: host_name = ip-10-31-11-158 DHCP4.OPTION[8]: ip_address = 192.0.2.223 DHCP4.OPTION[9]: next_server = 192.0.2.1 DHCP4.OPTION[10]: requested_broadcast_address = 1 DHCP4.OPTION[11]: requested_domain_name = 1 DHCP4.OPTION[12]: requested_domain_name_servers = 1 DHCP4.OPTION[13]: requested_domain_search = 1 DHCP4.OPTION[14]: requested_host_name = 1 DHCP4.OPTION[15]: requested_interface_mtu = 1 DHCP4.OPTION[16]: requested_ms_classless_static_routes = 1 DHCP4.OPTION[17]: requested_nis_domain = 1 DHCP4.OPTION[18]: requested_nis_servers = 1 DHCP4.OPTION[19]: requested_ntp_servers = 1 DHCP4.OPTION[20]: requested_rfc3442_classless_static_routes = 1 DHCP4.OPTION[21]: requested_root_path = 1 DHCP4.OPTION[22]: requested_routers = 1 DHCP4.OPTION[23]: requested_static_routes = 1 DHCP4.OPTION[24]: requested_subnet_mask = 1 DHCP4.OPTION[25]: requested_time_offset = 1 DHCP4.OPTION[26]: requested_wpad = 1 DHCP4.OPTION[27]: routers = 192.0.2.1 DHCP4.OPTION[28]: subnet_mask = 255.255.255.0 IP6.ADDRESS[1]: 2001:db8::1ab/128 IP6.ADDRESS[2]: 2001:db8::be8c:fde3:4c17:d92b/64 IP6.ADDRESS[3]: fe80::c76c:d44b:b7e7:a551/64 IP6.GATEWAY: fe80::7453:13ff:fed2:efa3 IP6.ROUTE[1]: dst = 2001:db8::1ab/128, nh = ::, mt = 300 IP6.ROUTE[2]: dst = fe80::/64, nh = ::, mt = 1024 IP6.ROUTE[3]: dst = 2001:db8::/64, nh = ::, mt = 300 IP6.ROUTE[4]: dst = ::/0, nh = fe80::7453:13ff:fed2:efa3, mt = 300 IP6.DNS[1]: 
2001:db8::24d4:43ff:fed5:d8f1 IP6.DNS[2]: fe80::7453:13ff:fed2:efa3 DHCP6.OPTION[1]: dhcp6_client_id = 00:04:62:cc:ef:b5:d7:1a:b6:b9:cc:b2:8e:19:f9:82:3d:b4 DHCP6.OPTION[2]: dhcp6_name_servers = 2001:db8::24d4:43ff:fed5:d8f1 DHCP6.OPTION[3]: fqdn_fqdn = ip-10-31-11-158 DHCP6.OPTION[4]: iaid = 8c:3b:13:c0 DHCP6.OPTION[5]: ip6_address = 2001:db8::1ab 13531 1726882458.86460: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000160 13531 1726882458.86536: no more pending results, returning what we have 13531 1726882458.86544: results queue empty 13531 1726882458.86545: checking for any_errors_fatal 13531 1726882458.86547: done checking for any_errors_fatal 13531 1726882458.86548: checking for max_fail_percentage 13531 1726882458.86550: done checking for max_fail_percentage 13531 1726882458.86551: checking to see if all hosts have failed and the running result is not ok 13531 1726882458.86552: done checking to see if all hosts have failed 13531 1726882458.86553: getting the remaining hosts for this loop 13531 1726882458.86554: done getting the remaining hosts for this loop 13531 1726882458.86558: getting the next task for host managed_node2 13531 1726882458.86563: done getting next task for host managed_node2 13531 1726882458.86567: ^ task is: TASK: Assert that the controller profile is activated 13531 1726882458.86569: ^ state is: HOST STATE: block=2, task=32, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882458.86573: getting variables 13531 1726882458.86574: in VariableManager get_vars() 13531 1726882458.86628: Calling all_inventory to load vars for managed_node2 13531 1726882458.86631: Calling groups_inventory to load vars for managed_node2 13531 1726882458.86633: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882458.86645: Calling all_plugins_play to load vars for managed_node2 13531 1726882458.86648: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882458.86652: Calling groups_plugins_play to load vars for managed_node2 13531 1726882458.88405: WORKER PROCESS EXITING 13531 1726882458.89674: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882458.93034: done with get_vars() 13531 1726882458.93074: done getting variables 13531 1726882458.93137: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the controller profile is activated] ************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:207 Friday 20 September 2024 21:34:18 -0400 (0:00:00.428) 0:00:46.826 ****** 13531 1726882458.93170: entering _queue_task() for managed_node2/assert 13531 1726882458.94113: worker is 1 (out of 1 available) 13531 1726882458.94126: exiting _queue_task() for managed_node2/assert 13531 1726882458.94141: done queuing things up, now waiting for results queue to drain 13531 1726882458.94142: waiting for pending results... 
13531 1726882458.94370: running TaskExecutor() for managed_node2/TASK: Assert that the controller profile is activated 13531 1726882458.95278: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000161 13531 1726882458.95301: variable 'ansible_search_path' from source: unknown 13531 1726882458.95347: calling self._execute() 13531 1726882458.95459: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882458.95476: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882458.95491: variable 'omit' from source: magic vars 13531 1726882458.95856: variable 'ansible_distribution_major_version' from source: facts 13531 1726882458.96585: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882458.96718: variable 'network_provider' from source: set_fact 13531 1726882458.96730: Evaluated conditional (network_provider == "nm"): True 13531 1726882458.96742: variable 'omit' from source: magic vars 13531 1726882458.96775: variable 'omit' from source: magic vars 13531 1726882458.96884: variable 'controller_profile' from source: play vars 13531 1726882458.96908: variable 'omit' from source: magic vars 13531 1726882458.96961: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882458.97001: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882458.97027: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882458.97049: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882458.97072: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882458.97109: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882458.97117: variable 
'ansible_host' from source: host vars for 'managed_node2' 13531 1726882458.97125: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882458.97237: Set connection var ansible_pipelining to False 13531 1726882458.97249: Set connection var ansible_timeout to 10 13531 1726882458.97266: Set connection var ansible_shell_executable to /bin/sh 13531 1726882458.97277: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882458.97284: Set connection var ansible_connection to ssh 13531 1726882458.97290: Set connection var ansible_shell_type to sh 13531 1726882458.97323: variable 'ansible_shell_executable' from source: unknown 13531 1726882458.97331: variable 'ansible_connection' from source: unknown 13531 1726882458.97975: variable 'ansible_module_compression' from source: unknown 13531 1726882458.97983: variable 'ansible_shell_type' from source: unknown 13531 1726882458.97990: variable 'ansible_shell_executable' from source: unknown 13531 1726882458.97997: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882458.98005: variable 'ansible_pipelining' from source: unknown 13531 1726882458.98013: variable 'ansible_timeout' from source: unknown 13531 1726882458.98022: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882458.98170: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882458.98190: variable 'omit' from source: magic vars 13531 1726882458.98201: starting attempt loop 13531 1726882458.98209: running the handler 13531 1726882458.98392: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13531 1726882459.02199: Loading FilterModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13531 1726882459.02279: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13531 1726882459.02322: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13531 1726882459.02370: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13531 1726882459.02415: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13531 1726882459.02545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882459.02642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882459.02680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882459.02725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882459.02744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882459.02860: variable 'active_controller_profile' from source: set_fact 13531 1726882459.02897: Evaluated conditional (active_controller_profile.stdout | length != 0): True 13531 1726882459.02908: handler run complete 13531 1726882459.02928: attempt 
loop complete, returning result 13531 1726882459.02935: _execute() done 13531 1726882459.02942: dumping result to json 13531 1726882459.02949: done dumping result, returning 13531 1726882459.02966: done running TaskExecutor() for managed_node2/TASK: Assert that the controller profile is activated [0e448fcc-3ce9-4fd9-519d-000000000161] 13531 1726882459.02979: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000161
ok: [managed_node2] => {
    "changed": false
}

MSG:

All assertions passed
13531 1726882459.03133: no more pending results, returning what we have 13531 1726882459.03136: results queue empty 13531 1726882459.03137: checking for any_errors_fatal 13531 1726882459.03147: done checking for any_errors_fatal 13531 1726882459.03148: checking for max_fail_percentage 13531 1726882459.03150: done checking for max_fail_percentage 13531 1726882459.03151: checking to see if all hosts have failed and the running result is not ok 13531 1726882459.03151: done checking to see if all hosts have failed 13531 1726882459.03152: getting the remaining hosts for this loop 13531 1726882459.03153: done getting the remaining hosts for this loop 13531 1726882459.03157: getting the next task for host managed_node2 13531 1726882459.03162: done getting next task for host managed_node2 13531 1726882459.03171: ^ task is: TASK: Get the controller device details 13531 1726882459.03173: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13531 1726882459.03177: getting variables 13531 1726882459.03179: in VariableManager get_vars() 13531 1726882459.03234: Calling all_inventory to load vars for managed_node2 13531 1726882459.03245: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000161 13531 1726882459.03250: Calling groups_inventory to load vars for managed_node2 13531 1726882459.03256: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882459.03262: WORKER PROCESS EXITING 13531 1726882459.03275: Calling all_plugins_play to load vars for managed_node2 13531 1726882459.03278: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882459.03281: Calling groups_plugins_play to load vars for managed_node2 13531 1726882459.05447: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882459.07951: done with get_vars() 13531 1726882459.07979: done getting variables 13531 1726882459.08042: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get the controller device details] *************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:214 Friday 20 September 2024 21:34:19 -0400 (0:00:00.149) 0:00:46.975 ****** 13531 1726882459.08074: entering _queue_task() for managed_node2/command 13531 1726882459.08441: worker is 1 (out of 1 available) 13531 1726882459.08454: exiting _queue_task() for managed_node2/command 13531 1726882459.08468: done queuing things up, now waiting for results queue to drain 13531 1726882459.08470: waiting for pending results... 
13531 1726882459.09469: running TaskExecutor() for managed_node2/TASK: Get the controller device details 13531 1726882459.09643: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000162 13531 1726882459.09671: variable 'ansible_search_path' from source: unknown 13531 1726882459.09715: calling self._execute() 13531 1726882459.09826: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882459.09979: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882459.09995: variable 'omit' from source: magic vars 13531 1726882459.10372: variable 'ansible_distribution_major_version' from source: facts 13531 1726882459.10953: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882459.11091: variable 'network_provider' from source: set_fact 13531 1726882459.11103: Evaluated conditional (network_provider == "initscripts"): False 13531 1726882459.11111: when evaluation is False, skipping this task 13531 1726882459.11118: _execute() done 13531 1726882459.11125: dumping result to json 13531 1726882459.11132: done dumping result, returning 13531 1726882459.11178: done running TaskExecutor() for managed_node2/TASK: Get the controller device details [0e448fcc-3ce9-4fd9-519d-000000000162] 13531 1726882459.11191: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000162
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_provider == \"initscripts\"",
    "skip_reason": "Conditional result was False"
}
13531 1726882459.11352: no more pending results, returning what we have 13531 1726882459.11357: results queue empty 13531 1726882459.11358: checking for any_errors_fatal 13531 1726882459.11370: done checking for any_errors_fatal 13531 1726882459.11371: checking for max_fail_percentage 13531 1726882459.11373: done checking for max_fail_percentage 13531 1726882459.11375: checking to see if all hosts have failed and the running result is not ok 13531 1726882459.11375: done checking 
to see if all hosts have failed 13531 1726882459.11376: getting the remaining hosts for this loop 13531 1726882459.11378: done getting the remaining hosts for this loop 13531 1726882459.11381: getting the next task for host managed_node2 13531 1726882459.11389: done getting next task for host managed_node2 13531 1726882459.11392: ^ task is: TASK: Assert that the controller profile is activated 13531 1726882459.11395: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13531 1726882459.11399: getting variables 13531 1726882459.11401: in VariableManager get_vars() 13531 1726882459.11466: Calling all_inventory to load vars for managed_node2 13531 1726882459.11470: Calling groups_inventory to load vars for managed_node2 13531 1726882459.11472: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882459.11487: Calling all_plugins_play to load vars for managed_node2 13531 1726882459.11491: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882459.11494: Calling groups_plugins_play to load vars for managed_node2 13531 1726882459.12191: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000162 13531 1726882459.12194: WORKER PROCESS EXITING 13531 1726882459.14498: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882459.18012: done with get_vars() 13531 1726882459.18056: done getting variables 13531 1726882459.18324: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 
TASK [Assert that the controller profile is activated] ************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:221 Friday 20 September 2024 21:34:19 -0400 (0:00:00.102) 0:00:47.078 ****** 13531 1726882459.18357: entering _queue_task() for managed_node2/assert 13531 1726882459.19098: worker is 1 (out of 1 available) 13531 1726882459.19111: exiting _queue_task() for managed_node2/assert 13531 1726882459.19125: done queuing things up, now waiting for results queue to drain 13531 1726882459.19126: waiting for pending results... 13531 1726882459.20216: running TaskExecutor() for managed_node2/TASK: Assert that the controller profile is activated 13531 1726882459.21269: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000163 13531 1726882459.21618: variable 'ansible_search_path' from source: unknown 13531 1726882459.21668: calling self._execute() 13531 1726882459.21776: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882459.22479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882459.22494: variable 'omit' from source: magic vars 13531 1726882459.22878: variable 'ansible_distribution_major_version' from source: facts 13531 1726882459.22896: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882459.23017: variable 'network_provider' from source: set_fact 13531 1726882459.23029: Evaluated conditional (network_provider == "initscripts"): False 13531 1726882459.23037: when evaluation is False, skipping this task 13531 1726882459.23045: _execute() done 13531 1726882459.23052: dumping result to json 13531 1726882459.23066: done dumping result, returning 13531 1726882459.23082: done running TaskExecutor() for managed_node2/TASK: Assert that the controller profile is activated [0e448fcc-3ce9-4fd9-519d-000000000163] 13531 1726882459.23094: sending task result for task 
0e448fcc-3ce9-4fd9-519d-000000000163
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_provider == \"initscripts\"",
    "skip_reason": "Conditional result was False"
}
13531 1726882459.23252: no more pending results, returning what we have 13531 1726882459.23256: results queue empty 13531 1726882459.23257: checking for any_errors_fatal 13531 1726882459.23267: done checking for any_errors_fatal 13531 1726882459.23268: checking for max_fail_percentage 13531 1726882459.23270: done checking for max_fail_percentage 13531 1726882459.23270: checking to see if all hosts have failed and the running result is not ok 13531 1726882459.23271: done checking to see if all hosts have failed 13531 1726882459.23272: getting the remaining hosts for this loop 13531 1726882459.23273: done getting the remaining hosts for this loop 13531 1726882459.23277: getting the next task for host managed_node2 13531 1726882459.23288: done getting next task for host managed_node2 13531 1726882459.23294: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13531 1726882459.23298: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13531 1726882459.23317: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000163 13531 1726882459.23321: WORKER PROCESS EXITING 13531 1726882459.23337: getting variables 13531 1726882459.23340: in VariableManager get_vars() 13531 1726882459.23410: Calling all_inventory to load vars for managed_node2 13531 1726882459.23413: Calling groups_inventory to load vars for managed_node2 13531 1726882459.23416: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882459.23429: Calling all_plugins_play to load vars for managed_node2 13531 1726882459.23433: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882459.23436: Calling groups_plugins_play to load vars for managed_node2 13531 1726882459.26792: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882459.29318: done with get_vars() 13531 1726882459.29351: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:34:19 -0400 (0:00:00.110) 0:00:47.189 ****** 13531 1726882459.29444: entering _queue_task() for managed_node2/include_tasks 13531 1726882459.30467: worker is 1 (out of 1 available) 13531 1726882459.30479: exiting _queue_task() for managed_node2/include_tasks 13531 1726882459.30493: done queuing things up, now waiting for results queue to drain 13531 1726882459.30494: waiting for pending results... 
13531 1726882459.31863: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13531 1726882459.32039: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000016c 13531 1726882459.32068: variable 'ansible_search_path' from source: unknown 13531 1726882459.32077: variable 'ansible_search_path' from source: unknown 13531 1726882459.32121: calling self._execute() 13531 1726882459.32222: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882459.32877: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882459.32891: variable 'omit' from source: magic vars 13531 1726882459.33273: variable 'ansible_distribution_major_version' from source: facts 13531 1726882459.33292: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882459.33303: _execute() done 13531 1726882459.33310: dumping result to json 13531 1726882459.33317: done dumping result, returning 13531 1726882459.33329: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-4fd9-519d-00000000016c] 13531 1726882459.33340: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000016c 13531 1726882459.33498: no more pending results, returning what we have 13531 1726882459.33503: in VariableManager get_vars() 13531 1726882459.33566: Calling all_inventory to load vars for managed_node2 13531 1726882459.33569: Calling groups_inventory to load vars for managed_node2 13531 1726882459.33571: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882459.33578: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000016c 13531 1726882459.33582: WORKER PROCESS EXITING 13531 1726882459.33596: Calling all_plugins_play to load vars for managed_node2 13531 1726882459.33599: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882459.33602: Calling 
groups_plugins_play to load vars for managed_node2 13531 1726882459.35700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882459.39852: done with get_vars() 13531 1726882459.39887: variable 'ansible_search_path' from source: unknown 13531 1726882459.39888: variable 'ansible_search_path' from source: unknown 13531 1726882459.39932: we have included files to process 13531 1726882459.39933: generating all_blocks data 13531 1726882459.39936: done generating all_blocks data 13531 1726882459.39942: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13531 1726882459.39943: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13531 1726882459.39946: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13531 1726882459.40535: done processing included file 13531 1726882459.40537: iterating over new_blocks loaded from include file 13531 1726882459.40538: in VariableManager get_vars() 13531 1726882459.40576: done with get_vars() 13531 1726882459.40578: filtering new block on tags 13531 1726882459.40611: done filtering new block on tags 13531 1726882459.40614: in VariableManager get_vars() 13531 1726882459.40647: done with get_vars() 13531 1726882459.40648: filtering new block on tags 13531 1726882459.40692: done filtering new block on tags 13531 1726882459.40695: in VariableManager get_vars() 13531 1726882459.40725: done with get_vars() 13531 1726882459.40727: filtering new block on tags 13531 1726882459.40766: done filtering new block on tags 13531 1726882459.40768: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 13531 1726882459.40774: extending task lists for 
all hosts with included blocks 13531 1726882459.41826: done extending task lists 13531 1726882459.41827: done processing included files 13531 1726882459.41828: results queue empty 13531 1726882459.41829: checking for any_errors_fatal 13531 1726882459.41833: done checking for any_errors_fatal 13531 1726882459.41833: checking for max_fail_percentage 13531 1726882459.41834: done checking for max_fail_percentage 13531 1726882459.41835: checking to see if all hosts have failed and the running result is not ok 13531 1726882459.41836: done checking to see if all hosts have failed 13531 1726882459.41837: getting the remaining hosts for this loop 13531 1726882459.41838: done getting the remaining hosts for this loop 13531 1726882459.41841: getting the next task for host managed_node2 13531 1726882459.41845: done getting next task for host managed_node2 13531 1726882459.41848: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 13531 1726882459.41852: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13531 1726882459.41865: getting variables 13531 1726882459.41866: in VariableManager get_vars() 13531 1726882459.41889: Calling all_inventory to load vars for managed_node2 13531 1726882459.41892: Calling groups_inventory to load vars for managed_node2 13531 1726882459.41894: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882459.41899: Calling all_plugins_play to load vars for managed_node2 13531 1726882459.41902: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882459.41905: Calling groups_plugins_play to load vars for managed_node2 13531 1726882459.47913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882459.49593: done with get_vars() 13531 1726882459.49621: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:34:19 -0400 (0:00:00.202) 0:00:47.392 ****** 13531 1726882459.49702: entering _queue_task() for managed_node2/setup 13531 1726882459.50030: worker is 1 (out of 1 available) 13531 1726882459.50042: exiting _queue_task() for managed_node2/setup 13531 1726882459.50054: done queuing things up, now waiting for results queue to drain 13531 1726882459.50056: waiting for pending results... 
13531 1726882459.50375: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 13531 1726882459.50538: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000914 13531 1726882459.50562: variable 'ansible_search_path' from source: unknown 13531 1726882459.50568: variable 'ansible_search_path' from source: unknown 13531 1726882459.50602: calling self._execute() 13531 1726882459.50699: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882459.50703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882459.50714: variable 'omit' from source: magic vars 13531 1726882459.51089: variable 'ansible_distribution_major_version' from source: facts 13531 1726882459.51101: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882459.51323: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13531 1726882459.53668: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13531 1726882459.53742: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13531 1726882459.53782: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13531 1726882459.53815: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13531 1726882459.53841: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13531 1726882459.53919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882459.53946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882459.53971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882459.54015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882459.54029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882459.54080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882459.54106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882459.54130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882459.54169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882459.54183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882459.54357: variable '__network_required_facts' from source: role 
'' defaults 13531 1726882459.54365: variable 'ansible_facts' from source: unknown 13531 1726882459.55286: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 13531 1726882459.55290: when evaluation is False, skipping this task 13531 1726882459.55293: _execute() done 13531 1726882459.55296: dumping result to json 13531 1726882459.55298: done dumping result, returning 13531 1726882459.55305: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-4fd9-519d-000000000914] 13531 1726882459.55310: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000914 13531 1726882459.55417: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000914 13531 1726882459.55420: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
13531 1726882459.55467: no more pending results, returning what we have 13531 1726882459.55471: results queue empty 13531 1726882459.55472: checking for any_errors_fatal 13531 1726882459.55474: done checking for any_errors_fatal 13531 1726882459.55474: checking for max_fail_percentage 13531 1726882459.55476: done checking for max_fail_percentage 13531 1726882459.55477: checking to see if all hosts have failed and the running result is not ok 13531 1726882459.55478: done checking to see if all hosts have failed 13531 1726882459.55479: getting the remaining hosts for this loop 13531 1726882459.55480: done getting the remaining hosts for this loop 13531 1726882459.55483: getting the next task for host managed_node2 13531 1726882459.55494: done getting next task for host managed_node2 13531 1726882459.55499: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 13531 1726882459.55504: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, 
handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13531 1726882459.55526: getting variables 13531 1726882459.55528: in VariableManager get_vars() 13531 1726882459.55590: Calling all_inventory to load vars for managed_node2 13531 1726882459.55593: Calling groups_inventory to load vars for managed_node2 13531 1726882459.55596: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882459.55608: Calling all_plugins_play to load vars for managed_node2 13531 1726882459.55611: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882459.55614: Calling groups_plugins_play to load vars for managed_node2 13531 1726882459.57373: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882459.59091: done with get_vars() 13531 1726882459.59119: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:34:19 -0400 (0:00:00.095) 0:00:47.487 ****** 13531 1726882459.59227: entering _queue_task() for managed_node2/stat 13531 1726882459.59952: worker is 1 (out of 1 available) 13531 1726882459.59965: exiting _queue_task() for managed_node2/stat 13531 1726882459.59979: done queuing things up, now waiting for results queue to drain 13531 1726882459.59980: waiting for pending results... 
13531 1726882459.60762: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 13531 1726882459.61162: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000916 13531 1726882459.61886: variable 'ansible_search_path' from source: unknown 13531 1726882459.61895: variable 'ansible_search_path' from source: unknown 13531 1726882459.61939: calling self._execute() 13531 1726882459.62042: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882459.62059: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882459.62078: variable 'omit' from source: magic vars 13531 1726882459.62468: variable 'ansible_distribution_major_version' from source: facts 13531 1726882459.62487: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882459.63224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13531 1726882459.63505: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13531 1726882459.63558: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13531 1726882459.63622: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13531 1726882459.64303: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13531 1726882459.64396: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13531 1726882459.64426: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13531 1726882459.64459: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882459.64495: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13531 1726882459.64598: variable '__network_is_ostree' from source: set_fact 13531 1726882459.64610: Evaluated conditional (not __network_is_ostree is defined): False 13531 1726882459.64618: when evaluation is False, skipping this task 13531 1726882459.64625: _execute() done 13531 1726882459.64631: dumping result to json 13531 1726882459.64637: done dumping result, returning 13531 1726882459.64648: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-4fd9-519d-000000000916] 13531 1726882459.64662: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000916 skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 13531 1726882459.64824: no more pending results, returning what we have 13531 1726882459.64829: results queue empty 13531 1726882459.64830: checking for any_errors_fatal 13531 1726882459.64837: done checking for any_errors_fatal 13531 1726882459.64837: checking for max_fail_percentage 13531 1726882459.64839: done checking for max_fail_percentage 13531 1726882459.64840: checking to see if all hosts have failed and the running result is not ok 13531 1726882459.64841: done checking to see if all hosts have failed 13531 1726882459.64842: getting the remaining hosts for this loop 13531 1726882459.64843: done getting the remaining hosts for this loop 13531 1726882459.64846: getting the next task for host managed_node2 13531 1726882459.64852: done getting next task for host managed_node2 13531 
1726882459.64856: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 13531 1726882459.64861: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13531 1726882459.64886: getting variables 13531 1726882459.64888: in VariableManager get_vars() 13531 1726882459.64942: Calling all_inventory to load vars for managed_node2 13531 1726882459.64945: Calling groups_inventory to load vars for managed_node2 13531 1726882459.64947: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882459.64959: Calling all_plugins_play to load vars for managed_node2 13531 1726882459.64962: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882459.64967: Calling groups_plugins_play to load vars for managed_node2 13531 1726882459.66172: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000916 13531 1726882459.66177: WORKER PROCESS EXITING 13531 1726882459.67070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882459.70871: done with get_vars() 13531 1726882459.70903: done getting variables 13531 1726882459.71168: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:34:19 -0400 (0:00:00.119) 0:00:47.607 ****** 13531 1726882459.71210: entering _queue_task() for managed_node2/set_fact 13531 1726882459.71842: worker is 1 (out of 1 available) 13531 1726882459.71855: exiting _queue_task() for managed_node2/set_fact 13531 1726882459.72372: done queuing things up, now waiting for results queue to drain 13531 1726882459.72374: waiting for pending results... 
13531 1726882459.72907: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 13531 1726882459.73304: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000917 13531 1726882459.73318: variable 'ansible_search_path' from source: unknown 13531 1726882459.73321: variable 'ansible_search_path' from source: unknown 13531 1726882459.73360: calling self._execute() 13531 1726882459.73666: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882459.73670: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882459.73707: variable 'omit' from source: magic vars 13531 1726882459.74769: variable 'ansible_distribution_major_version' from source: facts 13531 1726882459.74773: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882459.74820: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13531 1726882459.75319: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13531 1726882459.75484: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13531 1726882459.75542: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13531 1726882459.75698: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13531 1726882459.75899: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13531 1726882459.75923: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13531 1726882459.75950: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882459.75981: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13531 1726882459.76087: variable '__network_is_ostree' from source: set_fact 13531 1726882459.76095: Evaluated conditional (not __network_is_ostree is defined): False 13531 1726882459.76218: when evaluation is False, skipping this task 13531 1726882459.76222: _execute() done 13531 1726882459.76224: dumping result to json 13531 1726882459.76227: done dumping result, returning 13531 1726882459.76235: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-4fd9-519d-000000000917] 13531 1726882459.76242: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000917 13531 1726882459.76350: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000917 13531 1726882459.76354: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 13531 1726882459.76405: no more pending results, returning what we have 13531 1726882459.76410: results queue empty 13531 1726882459.76411: checking for any_errors_fatal 13531 1726882459.76418: done checking for any_errors_fatal 13531 1726882459.76419: checking for max_fail_percentage 13531 1726882459.76421: done checking for max_fail_percentage 13531 1726882459.76422: checking to see if all hosts have failed and the running result is not ok 13531 1726882459.76423: done checking to see if all hosts have failed 13531 1726882459.76423: getting the remaining hosts for this loop 13531 1726882459.76425: done getting the remaining hosts for this loop 
13531 1726882459.76428: getting the next task for host managed_node2 13531 1726882459.76439: done getting next task for host managed_node2 13531 1726882459.76443: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 13531 1726882459.76449: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13531 1726882459.76475: getting variables 13531 1726882459.76477: in VariableManager get_vars() 13531 1726882459.76535: Calling all_inventory to load vars for managed_node2 13531 1726882459.76538: Calling groups_inventory to load vars for managed_node2 13531 1726882459.76541: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882459.76552: Calling all_plugins_play to load vars for managed_node2 13531 1726882459.76558: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882459.76561: Calling groups_plugins_play to load vars for managed_node2 13531 1726882459.80054: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882459.84754: done with get_vars() 13531 1726882459.85121: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:34:19 -0400 (0:00:00.140) 0:00:47.747 ****** 13531 1726882459.85272: entering _queue_task() for managed_node2/service_facts 13531 1726882459.85615: worker is 1 (out of 1 available) 13531 1726882459.85628: exiting _queue_task() for managed_node2/service_facts 13531 1726882459.85645: done queuing things up, now waiting for results queue to drain 13531 1726882459.85647: waiting for pending results... 
13531 1726882459.85946: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 13531 1726882459.86133: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000919 13531 1726882459.86153: variable 'ansible_search_path' from source: unknown 13531 1726882459.86160: variable 'ansible_search_path' from source: unknown 13531 1726882459.86211: calling self._execute() 13531 1726882459.86322: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882459.86333: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882459.86346: variable 'omit' from source: magic vars 13531 1726882459.86740: variable 'ansible_distribution_major_version' from source: facts 13531 1726882459.86757: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882459.86770: variable 'omit' from source: magic vars 13531 1726882459.86865: variable 'omit' from source: magic vars 13531 1726882459.86904: variable 'omit' from source: magic vars 13531 1726882459.86958: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882459.86999: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882459.87025: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882459.87049: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882459.87077: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882459.87112: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882459.87121: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882459.87130: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 13531 1726882459.87239: Set connection var ansible_pipelining to False 13531 1726882459.87249: Set connection var ansible_timeout to 10 13531 1726882459.87258: Set connection var ansible_shell_executable to /bin/sh 13531 1726882459.87272: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882459.87279: Set connection var ansible_connection to ssh 13531 1726882459.87288: Set connection var ansible_shell_type to sh 13531 1726882459.87317: variable 'ansible_shell_executable' from source: unknown 13531 1726882459.87324: variable 'ansible_connection' from source: unknown 13531 1726882459.87330: variable 'ansible_module_compression' from source: unknown 13531 1726882459.87335: variable 'ansible_shell_type' from source: unknown 13531 1726882459.87340: variable 'ansible_shell_executable' from source: unknown 13531 1726882459.87345: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882459.87351: variable 'ansible_pipelining' from source: unknown 13531 1726882459.87357: variable 'ansible_timeout' from source: unknown 13531 1726882459.87365: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882459.87588: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13531 1726882459.87614: variable 'omit' from source: magic vars 13531 1726882459.87625: starting attempt loop 13531 1726882459.87631: running the handler 13531 1726882459.87650: _low_level_execute_command(): starting 13531 1726882459.87667: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13531 1726882459.88441: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882459.88458: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 13531 1726882459.88481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882459.88503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882459.88548: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882459.88562: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882459.88582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882459.88606: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882459.88618: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882459.88631: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882459.88644: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882459.88660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882459.88679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882459.88697: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882459.88713: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882459.88728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882459.88805: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882459.88835: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882459.88852: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882459.88998: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
13531 1726882459.90719: stdout chunk (state=3): >>>/root <<< 13531 1726882459.90921: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882459.90925: stdout chunk (state=3): >>><<< 13531 1726882459.90928: stderr chunk (state=3): >>><<< 13531 1726882459.91062: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882459.91070: _low_level_execute_command(): starting 13531 1726882459.91073: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882459.9095557-15649-258045738963599 `" && echo ansible-tmp-1726882459.9095557-15649-258045738963599="` echo /root/.ansible/tmp/ansible-tmp-1726882459.9095557-15649-258045738963599 `" ) && sleep 0' 13531 1726882459.92352: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882459.92356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882459.92396: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 13531 1726882459.92400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882459.92403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882459.92473: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882459.92478: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882459.92685: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882459.93198: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882459.95222: stdout chunk (state=3): >>>ansible-tmp-1726882459.9095557-15649-258045738963599=/root/.ansible/tmp/ansible-tmp-1726882459.9095557-15649-258045738963599 <<< 13531 1726882459.95318: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882459.95403: stderr chunk (state=3): >>><<< 13531 1726882459.95407: stdout chunk (state=3): >>><<< 13531 1726882459.95689: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882459.9095557-15649-258045738963599=/root/.ansible/tmp/ansible-tmp-1726882459.9095557-15649-258045738963599 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882459.95693: variable 'ansible_module_compression' from source: unknown 13531 1726882459.95696: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13531i_snf1g8/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 13531 1726882459.95698: variable 'ansible_facts' from source: unknown 13531 1726882459.95700: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882459.9095557-15649-258045738963599/AnsiballZ_service_facts.py 13531 1726882459.96423: Sending initial data 13531 1726882459.96426: Sent initial data (162 bytes) 13531 1726882459.98449: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882459.98467: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 13531 1726882459.98482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882459.98500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882459.98548: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882459.98627: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882459.98641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882459.98658: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882459.98672: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882459.98683: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882459.98695: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882459.98707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882459.98726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882459.98738: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882459.98748: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882459.98760: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882459.98843: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882459.98974: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882459.98990: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882459.99124: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 13531 1726882460.00954: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13531 1726882460.01049: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13531 1726882460.01152: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13531i_snf1g8/tmplnubxb9t /root/.ansible/tmp/ansible-tmp-1726882459.9095557-15649-258045738963599/AnsiballZ_service_facts.py <<< 13531 1726882460.01267: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13531 1726882460.02939: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882460.03022: stderr chunk (state=3): >>><<< 13531 1726882460.03026: stdout chunk (state=3): >>><<< 13531 1726882460.03047: done transferring module to remote 13531 1726882460.03062: _low_level_execute_command(): starting 13531 1726882460.03076: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882459.9095557-15649-258045738963599/ /root/.ansible/tmp/ansible-tmp-1726882459.9095557-15649-258045738963599/AnsiballZ_service_facts.py && sleep 0' 13531 1726882460.05222: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882460.05231: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 13531 1726882460.05256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882460.05281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882460.05312: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882460.05326: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882460.05336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882460.05357: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882460.05367: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882460.05374: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882460.05382: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882460.05392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882460.05408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882460.05413: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882460.05434: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882460.05441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882460.05528: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882460.05534: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882460.05545: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882460.05794: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
13531 1726882460.07696: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882460.07699: stdout chunk (state=3): >>><<< 13531 1726882460.07706: stderr chunk (state=3): >>><<< 13531 1726882460.07724: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882460.07727: _low_level_execute_command(): starting 13531 1726882460.07733: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882459.9095557-15649-258045738963599/AnsiballZ_service_facts.py && sleep 0' 13531 1726882460.09077: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882460.10084: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882460.10095: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 13531 1726882460.10118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882460.10160: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882460.10167: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882460.10179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882460.10192: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882460.10200: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882460.10206: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882460.10214: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882460.10223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882460.10234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882460.10241: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882460.10247: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882460.10259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882460.10334: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882460.10349: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882460.10352: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882460.10731: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882461.45078: stdout chunk (state=3): >>> {"ansible_facts": {"services": 
{"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", 
"status": "static", <<< 13531 1726882461.45086: stdout chunk (state=3): >>>"source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": 
"ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", 
"source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": 
"not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": 
{"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": 
"systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": 
"console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": 
"unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "ina<<< 13531 1726882461.45122: stdout chunk (state=3): >>>ctive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": 
{"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 13531 1726882461.46319: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 13531 1726882461.46369: stderr chunk (state=3): >>><<< 13531 1726882461.46373: stdout chunk (state=3): >>><<< 13531 1726882461.46399: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": 
{"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": 
{"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": 
"systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": 
"systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": 
"chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": 
{"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": 
"system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", 
"source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
13531 1726882461.47351: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882459.9095557-15649-258045738963599/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13531 1726882461.47359: _low_level_execute_command(): starting 13531 1726882461.47361: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882459.9095557-15649-258045738963599/ > /dev/null 2>&1 && sleep 0' 13531 1726882461.47578: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882461.47581: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882461.47584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882461.47586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882461.47588: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882461.47685: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882461.47688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882461.47691: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882461.47693: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is 
address <<< 13531 1726882461.47695: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882461.47697: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882461.47699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882461.47701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882461.47702: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882461.47704: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882461.47706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882461.47723: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882461.47742: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882461.47757: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882461.47886: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882461.50350: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882461.50376: stderr chunk (state=3): >>><<< 13531 1726882461.50378: stdout chunk (state=3): >>><<< 13531 1726882461.50474: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882461.50477: handler run complete 13531 1726882461.50514: variable 'ansible_facts' from source: unknown 13531 1726882461.50622: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882461.50876: variable 'ansible_facts' from source: unknown 13531 1726882461.50957: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882461.51070: attempt loop complete, returning result 13531 1726882461.51074: _execute() done 13531 1726882461.51076: dumping result to json 13531 1726882461.51109: done dumping result, returning 13531 1726882461.51123: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-4fd9-519d-000000000919] 13531 1726882461.51128: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000919 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13531 1726882461.52047: no more pending results, returning what we have 13531 1726882461.52052: results queue empty 13531 1726882461.52055: checking for any_errors_fatal 13531 1726882461.52073: done checking for any_errors_fatal 13531 1726882461.52074: checking for max_fail_percentage 13531 
1726882461.52076: done checking for max_fail_percentage 13531 1726882461.52077: checking to see if all hosts have failed and the running result is not ok 13531 1726882461.52078: done checking to see if all hosts have failed 13531 1726882461.52081: getting the remaining hosts for this loop 13531 1726882461.52082: done getting the remaining hosts for this loop 13531 1726882461.52089: getting the next task for host managed_node2 13531 1726882461.52101: done getting next task for host managed_node2 13531 1726882461.52109: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 13531 1726882461.52117: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13531 1726882461.52141: getting variables 13531 1726882461.52148: in VariableManager get_vars() 13531 1726882461.52291: Calling all_inventory to load vars for managed_node2 13531 1726882461.52294: Calling groups_inventory to load vars for managed_node2 13531 1726882461.52300: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882461.52387: Calling all_plugins_play to load vars for managed_node2 13531 1726882461.52394: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882461.52400: Calling groups_plugins_play to load vars for managed_node2 13531 1726882461.53295: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000919 13531 1726882461.53312: WORKER PROCESS EXITING 13531 1726882461.56137: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882461.60689: done with get_vars() 13531 1726882461.60817: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:34:21 -0400 (0:00:01.756) 0:00:49.504 ****** 13531 1726882461.60930: entering _queue_task() for managed_node2/package_facts 13531 1726882461.61510: worker is 1 (out of 1 available) 13531 1726882461.61523: exiting _queue_task() for managed_node2/package_facts 13531 1726882461.61541: done queuing things up, now waiting for results queue to drain 13531 1726882461.61542: waiting for pending results... 
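The task being queued next, `package_facts`, returns `ansible_facts.packages` as a mapping from package name to a list of installed-version dicts. A minimal sketch of checking membership in that structure (the sample package data is hypothetical, not taken from this run):

```python
# Sketch: membership check against a package_facts-style mapping.
def has_package(packages, name):
    """True if the package_facts mapping lists at least one
    installed version for the given package name."""
    return bool(packages.get(name))

# Hypothetical example data in the shape package_facts returns.
packages = {
    "NetworkManager": [
        {"name": "NetworkManager", "version": "1.40.0", "release": "1.el9"}
    ],
}
print(has_package(packages, "NetworkManager"))   # True
print(has_package(packages, "wireguard-tools"))  # False
```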
13531 1726882461.62522: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 13531 1726882461.62750: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000091a 13531 1726882461.62768: variable 'ansible_search_path' from source: unknown 13531 1726882461.62772: variable 'ansible_search_path' from source: unknown 13531 1726882461.62836: calling self._execute() 13531 1726882461.62970: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882461.62974: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882461.62984: variable 'omit' from source: magic vars 13531 1726882461.63439: variable 'ansible_distribution_major_version' from source: facts 13531 1726882461.63455: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882461.63472: variable 'omit' from source: magic vars 13531 1726882461.63567: variable 'omit' from source: magic vars 13531 1726882461.63609: variable 'omit' from source: magic vars 13531 1726882461.63654: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882461.63711: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882461.63860: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882461.63865: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882461.63869: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882461.63872: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882461.63874: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882461.63877: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 13531 1726882461.64072: Set connection var ansible_pipelining to False 13531 1726882461.64076: Set connection var ansible_timeout to 10 13531 1726882461.64079: Set connection var ansible_shell_executable to /bin/sh 13531 1726882461.64081: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882461.64083: Set connection var ansible_connection to ssh 13531 1726882461.64085: Set connection var ansible_shell_type to sh 13531 1726882461.64087: variable 'ansible_shell_executable' from source: unknown 13531 1726882461.64089: variable 'ansible_connection' from source: unknown 13531 1726882461.64091: variable 'ansible_module_compression' from source: unknown 13531 1726882461.64093: variable 'ansible_shell_type' from source: unknown 13531 1726882461.64095: variable 'ansible_shell_executable' from source: unknown 13531 1726882461.64097: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882461.64099: variable 'ansible_pipelining' from source: unknown 13531 1726882461.64101: variable 'ansible_timeout' from source: unknown 13531 1726882461.64103: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882461.64278: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13531 1726882461.64288: variable 'omit' from source: magic vars 13531 1726882461.64291: starting attempt loop 13531 1726882461.64294: running the handler 13531 1726882461.64308: _low_level_execute_command(): starting 13531 1726882461.64316: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13531 1726882461.65781: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882461.65786: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 13531 1726882461.65788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882461.65791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882461.65793: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882461.65795: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882461.65797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882461.65800: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882461.65802: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882461.65804: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882461.65806: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882461.65807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882461.65809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882461.65811: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882461.65813: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882461.65815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882461.65817: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882461.65818: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882461.65820: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882461.65823: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
13531 1726882461.67687: stdout chunk (state=3): >>>/root <<< 13531 1726882461.67775: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882461.67778: stdout chunk (state=3): >>><<< 13531 1726882461.67789: stderr chunk (state=3): >>><<< 13531 1726882461.67814: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882461.67828: _low_level_execute_command(): starting 13531 1726882461.67835: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882461.678139-15712-143303682868787 `" && echo ansible-tmp-1726882461.678139-15712-143303682868787="` echo /root/.ansible/tmp/ansible-tmp-1726882461.678139-15712-143303682868787 `" ) && sleep 0' 13531 1726882461.69969: stderr chunk (state=2): >>>OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882461.69976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882461.70013: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882461.70019: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882461.70147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882461.70155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882461.70221: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882461.70234: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882461.70241: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882461.70380: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882461.72302: stdout chunk (state=3): >>>ansible-tmp-1726882461.678139-15712-143303682868787=/root/.ansible/tmp/ansible-tmp-1726882461.678139-15712-143303682868787 <<< 13531 1726882461.72461: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882461.72469: stderr chunk (state=3): >>><<< 13531 1726882461.72472: stdout chunk (state=3): >>><<< 13531 1726882461.72494: 
_low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882461.678139-15712-143303682868787=/root/.ansible/tmp/ansible-tmp-1726882461.678139-15712-143303682868787 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882461.72543: variable 'ansible_module_compression' from source: unknown 13531 1726882461.72600: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13531i_snf1g8/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 13531 1726882461.72658: variable 'ansible_facts' from source: unknown 13531 1726882461.72849: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882461.678139-15712-143303682868787/AnsiballZ_package_facts.py 13531 1726882461.73790: Sending initial data 13531 1726882461.73794: Sent initial data (161 bytes) 13531 1726882461.76315: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882461.76319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882461.76366: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882461.76376: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882461.76496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 13531 1726882461.76500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882461.76571: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882461.76586: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882461.76592: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882461.76727: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882461.78549: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension 
"lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13531 1726882461.78641: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13531 1726882461.78744: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13531i_snf1g8/tmpm5ghv0r_ /root/.ansible/tmp/ansible-tmp-1726882461.678139-15712-143303682868787/AnsiballZ_package_facts.py <<< 13531 1726882461.78840: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13531 1726882461.81974: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882461.82048: stderr chunk (state=3): >>><<< 13531 1726882461.82052: stdout chunk (state=3): >>><<< 13531 1726882461.82078: done transferring module to remote 13531 1726882461.82090: _low_level_execute_command(): starting 13531 1726882461.82095: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882461.678139-15712-143303682868787/ /root/.ansible/tmp/ansible-tmp-1726882461.678139-15712-143303682868787/AnsiballZ_package_facts.py && sleep 0' 13531 1726882461.82992: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882461.83001: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882461.83011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882461.83024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882461.83073: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882461.83081: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882461.83091: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882461.83104: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882461.83112: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882461.83119: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882461.83127: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882461.83136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882461.83156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882461.83162: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882461.83172: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882461.83186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882461.83258: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882461.83280: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882461.83284: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882461.83414: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882461.85323: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882461.85328: stdout chunk (state=3): >>><<< 13531 1726882461.85330: stderr chunk (state=3): >>><<< 13531 1726882461.85356: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882461.85360: _low_level_execute_command(): starting 13531 1726882461.85362: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882461.678139-15712-143303682868787/AnsiballZ_package_facts.py && sleep 0' 13531 1726882461.86012: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882461.86019: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882461.86032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882461.86041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882461.86083: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882461.86086: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882461.86093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 
1726882461.86113: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882461.86119: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882461.86125: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882461.86132: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882461.86141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882461.86152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882461.86159: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882461.86167: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882461.86176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882461.86249: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882461.86277: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882461.86281: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882461.86420: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882462.32899: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", 
"release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": 
[{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null,<<< 13531 1726882462.32952: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": 
"53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "<<< 13531 1726882462.32978: stdout chunk (state=3): >>>rpm"}], "libevent": [{"name": 
"libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": 
[{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": <<< 13531 1726882462.32983: stdout chunk (state=3): >>>"7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": 
"1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1<<< 13531 1726882462.33011: stdout chunk (state=3): >>>.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", 
"release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": 
"7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": 
"2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": 
"libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", 
"version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch<<< 13531 1726882462.33029: stdout chunk (state=3): >>>": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", 
"epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", 
"version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": 
"2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", 
"source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": 
"481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "<<< 13531 1726882462.33089: stdout chunk (state=3): >>>version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", 
"version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "re<<< 13531 1726882462.33096: stdout chunk (state=3): >>>lease": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", 
"release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", <<< 13531 1726882462.33099: stdout chunk (state=3): >>>"source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": 
"geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 13531 1726882462.34552: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 13531 1726882462.34624: stderr chunk (state=3): >>><<< 13531 1726882462.34627: stdout chunk (state=3): >>><<< 13531 1726882462.34680: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": 
[{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": 
[{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", 
"release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": 
"0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": 
"libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": 
"cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", 
"version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", 
"release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", 
"version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": 
[{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", 
"release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": 
"elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": 
"2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", 
"release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": 
"liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": 
"146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": 
[{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": 
"perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", 
"version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, 
"arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": 
"python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": 
[{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
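The `package_facts` payload that ends above maps each package name to a *list* of entries (a package can be installed more than once, as with the two `gpg-pubkey` pseudo-packages), each entry carrying `name`, `version`, `release`, `epoch`, `arch`, and `source`; note that `epoch` is frequently `null` and `arch` can be `null` too. A minimal sketch of flattening such a mapping into RPM-style NEVRA strings — the helper name and the three sample entries are illustrative, copied from the structure shown in the log, not produced by the module itself:

```python
def nevra(pkg):
    """Format one package_facts entry as an RPM-style NEVRA string.

    A null epoch is simply omitted (rpm treats it as 0); a null arch
    (e.g. the gpg-pubkey pseudo-packages) is left off as well.
    """
    epoch = f"{pkg['epoch']}:" if pkg.get("epoch") is not None else ""
    arch = f".{pkg['arch']}" if pkg.get("arch") else ""
    return f"{pkg['name']}-{epoch}{pkg['version']}-{pkg['release']}{arch}"

# Illustrative subset of the ansible_facts.packages structure in the log.
packages = {
    "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9",
                 "epoch": None, "arch": "x86_64", "source": "rpm"}],
    "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c",
                    "release": "613798eb", "epoch": None, "arch": None,
                    "source": "rpm"}],
    "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0",
                          "release": "10.el9", "epoch": 17,
                          "arch": "x86_64", "source": "rpm"}],
}

# Flatten every entry of every package into one sorted list of strings.
flat = sorted(nevra(p) for entries in packages.values() for p in entries)
```

In a playbook this same shape is what a role tests with expressions like `'firewalld' in ansible_facts.packages`; the flattening above is only useful when you need the full version strings, e.g. for reporting.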
13531 1726882462.36403: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882461.678139-15712-143303682868787/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13531 1726882462.36419: _low_level_execute_command(): starting 13531 1726882462.36424: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882461.678139-15712-143303682868787/ > /dev/null 2>&1 && sleep 0' 13531 1726882462.36916: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882462.36920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882462.36955: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882462.36970: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 
originally 10.31.11.158 debug2: match found <<< 13531 1726882462.36981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882462.37032: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882462.37036: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882462.37151: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882462.38995: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882462.39057: stderr chunk (state=3): >>><<< 13531 1726882462.39060: stdout chunk (state=3): >>><<< 13531 1726882462.39073: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882462.39079: handler run 
complete 13531 1726882462.39809: variable 'ansible_facts' from source: unknown 13531 1726882462.40331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882462.42344: variable 'ansible_facts' from source: unknown 13531 1726882462.42817: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882462.43605: attempt loop complete, returning result 13531 1726882462.43617: _execute() done 13531 1726882462.43620: dumping result to json 13531 1726882462.43875: done dumping result, returning 13531 1726882462.43878: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-4fd9-519d-00000000091a] 13531 1726882462.43881: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000091a 13531 1726882462.45596: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000091a 13531 1726882462.45600: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13531 1726882462.45795: no more pending results, returning what we have 13531 1726882462.45798: results queue empty 13531 1726882462.45799: checking for any_errors_fatal 13531 1726882462.45804: done checking for any_errors_fatal 13531 1726882462.45804: checking for max_fail_percentage 13531 1726882462.45806: done checking for max_fail_percentage 13531 1726882462.45807: checking to see if all hosts have failed and the running result is not ok 13531 1726882462.45807: done checking to see if all hosts have failed 13531 1726882462.45808: getting the remaining hosts for this loop 13531 1726882462.45809: done getting the remaining hosts for this loop 13531 1726882462.45813: getting the next task for host managed_node2 13531 1726882462.45820: done getting next task for host managed_node2 13531 
1726882462.45823: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 13531 1726882462.45828: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13531 1726882462.45839: getting variables 13531 1726882462.45840: in VariableManager get_vars() 13531 1726882462.45884: Calling all_inventory to load vars for managed_node2 13531 1726882462.45887: Calling groups_inventory to load vars for managed_node2 13531 1726882462.45889: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882462.45898: Calling all_plugins_play to load vars for managed_node2 13531 1726882462.45901: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882462.45904: Calling groups_plugins_play to load vars for managed_node2 13531 1726882462.47638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882462.51787: done with get_vars() 13531 1726882462.51823: done getting variables 13531 1726882462.51909: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:34:22 -0400 (0:00:00.910) 0:00:50.414 ****** 13531 1726882462.51954: entering _queue_task() for managed_node2/debug 13531 1726882462.52307: worker is 1 (out of 1 available) 13531 1726882462.52328: exiting _queue_task() for managed_node2/debug 13531 1726882462.52342: done queuing things up, now waiting for results queue to drain 13531 1726882462.52344: waiting for pending results... 13531 1726882462.52641: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 13531 1726882462.52755: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000016d 13531 1726882462.52772: variable 'ansible_search_path' from source: unknown 13531 1726882462.52776: variable 'ansible_search_path' from source: unknown 13531 1726882462.52806: calling self._execute() 13531 1726882462.52888: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882462.52893: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882462.52902: variable 'omit' from source: magic vars 13531 1726882462.53190: variable 'ansible_distribution_major_version' from source: facts 13531 1726882462.53199: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882462.53204: variable 'omit' from source: magic vars 13531 1726882462.53252: variable 'omit' from source: magic vars 13531 1726882462.53327: variable 'network_provider' from source: set_fact 13531 1726882462.53340: variable 'omit' from source: magic vars 13531 1726882462.53381: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 
1726882462.53407: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882462.53424: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882462.53438: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882462.53448: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882462.53475: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882462.53479: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882462.53482: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882462.53555: Set connection var ansible_pipelining to False 13531 1726882462.53564: Set connection var ansible_timeout to 10 13531 1726882462.53569: Set connection var ansible_shell_executable to /bin/sh 13531 1726882462.53574: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882462.53577: Set connection var ansible_connection to ssh 13531 1726882462.53579: Set connection var ansible_shell_type to sh 13531 1726882462.53601: variable 'ansible_shell_executable' from source: unknown 13531 1726882462.53604: variable 'ansible_connection' from source: unknown 13531 1726882462.53607: variable 'ansible_module_compression' from source: unknown 13531 1726882462.53609: variable 'ansible_shell_type' from source: unknown 13531 1726882462.53611: variable 'ansible_shell_executable' from source: unknown 13531 1726882462.53613: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882462.53621: variable 'ansible_pipelining' from source: unknown 13531 1726882462.53623: variable 'ansible_timeout' from source: unknown 13531 1726882462.53625: variable 'ansible_ssh_extra_args' 
from source: host vars for 'managed_node2' 13531 1726882462.53743: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882462.53749: variable 'omit' from source: magic vars 13531 1726882462.53755: starting attempt loop 13531 1726882462.53760: running the handler 13531 1726882462.54069: handler run complete 13531 1726882462.54073: attempt loop complete, returning result 13531 1726882462.54075: _execute() done 13531 1726882462.54077: dumping result to json 13531 1726882462.54078: done dumping result, returning 13531 1726882462.54080: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-4fd9-519d-00000000016d] 13531 1726882462.54082: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000016d 13531 1726882462.54140: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000016d 13531 1726882462.54144: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 13531 1726882462.54206: no more pending results, returning what we have 13531 1726882462.54210: results queue empty 13531 1726882462.54211: checking for any_errors_fatal 13531 1726882462.54218: done checking for any_errors_fatal 13531 1726882462.54219: checking for max_fail_percentage 13531 1726882462.54220: done checking for max_fail_percentage 13531 1726882462.54221: checking to see if all hosts have failed and the running result is not ok 13531 1726882462.54222: done checking to see if all hosts have failed 13531 1726882462.54223: getting the remaining hosts for this loop 13531 1726882462.54224: done getting the remaining hosts for this loop 13531 1726882462.54228: getting the next task for host managed_node2 13531 1726882462.54234: done getting 
next task for host managed_node2 13531 1726882462.54238: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13531 1726882462.54242: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13531 1726882462.54254: getting variables 13531 1726882462.54255: in VariableManager get_vars() 13531 1726882462.54309: Calling all_inventory to load vars for managed_node2 13531 1726882462.54312: Calling groups_inventory to load vars for managed_node2 13531 1726882462.54315: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882462.54324: Calling all_plugins_play to load vars for managed_node2 13531 1726882462.54327: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882462.54330: Calling groups_plugins_play to load vars for managed_node2 13531 1726882462.55960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882462.57032: done with get_vars() 13531 1726882462.57054: done getting variables 13531 1726882462.57103: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:34:22 -0400 (0:00:00.051) 0:00:50.466 ****** 13531 1726882462.57133: entering _queue_task() for managed_node2/fail 13531 1726882462.57380: worker is 1 (out of 1 available) 13531 1726882462.57392: exiting _queue_task() for managed_node2/fail 13531 1726882462.57405: done queuing things up, now waiting for results queue to drain 13531 1726882462.57406: waiting for pending results... 
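The debug task completed above ("Print network provider", task path `roles/network/tasks/main.yml:7`) emitted `MSG: Using network provider: nm` using the `network_provider` variable that the log shows coming from `set_fact`. The role source is not part of this log, so the following is only a plausible minimal sketch of such a task, with the message text inferred from the logged output:

```yaml
# Hypothetical reconstruction -- the real task lives in the
# fedora.linux_system_roles.network collection and may differ.
- name: Print network provider
  debug:
    msg: "Using network provider: {{ network_provider }}"
```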
13531 1726882462.57630: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13531 1726882462.57731: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000016e 13531 1726882462.57766: variable 'ansible_search_path' from source: unknown 13531 1726882462.57769: variable 'ansible_search_path' from source: unknown 13531 1726882462.57836: calling self._execute() 13531 1726882462.58076: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882462.58081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882462.58084: variable 'omit' from source: magic vars 13531 1726882462.58342: variable 'ansible_distribution_major_version' from source: facts 13531 1726882462.58385: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882462.59402: variable 'network_state' from source: role '' defaults 13531 1726882462.59413: Evaluated conditional (network_state != {}): False 13531 1726882462.59416: when evaluation is False, skipping this task 13531 1726882462.59419: _execute() done 13531 1726882462.59422: dumping result to json 13531 1726882462.59425: done dumping result, returning 13531 1726882462.59434: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-4fd9-519d-00000000016e] 13531 1726882462.59439: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000016e 13531 1726882462.59751: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000016e 13531 1726882462.59754: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13531 1726882462.59806: no more pending results, 
returning what we have 13531 1726882462.59810: results queue empty 13531 1726882462.59811: checking for any_errors_fatal 13531 1726882462.59819: done checking for any_errors_fatal 13531 1726882462.59819: checking for max_fail_percentage 13531 1726882462.59821: done checking for max_fail_percentage 13531 1726882462.59822: checking to see if all hosts have failed and the running result is not ok 13531 1726882462.59823: done checking to see if all hosts have failed 13531 1726882462.59823: getting the remaining hosts for this loop 13531 1726882462.59825: done getting the remaining hosts for this loop 13531 1726882462.59828: getting the next task for host managed_node2 13531 1726882462.59835: done getting next task for host managed_node2 13531 1726882462.59839: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13531 1726882462.59844: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13531 1726882462.59869: getting variables 13531 1726882462.59872: in VariableManager get_vars() 13531 1726882462.59920: Calling all_inventory to load vars for managed_node2 13531 1726882462.59923: Calling groups_inventory to load vars for managed_node2 13531 1726882462.59926: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882462.59935: Calling all_plugins_play to load vars for managed_node2 13531 1726882462.59938: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882462.59940: Calling groups_plugins_play to load vars for managed_node2 13531 1726882462.62707: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882462.66630: done with get_vars() 13531 1726882462.66672: done getting variables 13531 1726882462.66856: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:34:22 -0400 (0:00:00.097) 0:00:50.564 ****** 13531 1726882462.67014: entering _queue_task() for managed_node2/fail 13531 1726882462.67730: worker is 1 (out of 1 available) 13531 1726882462.67743: exiting _queue_task() for managed_node2/fail 13531 1726882462.67757: done queuing things up, now waiting for results queue to drain 13531 1726882462.67759: waiting for pending results... 
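The two "Abort applying the network state configuration ..." fail tasks above were skipped because the executor logged `Evaluated conditional (network_state != {}): False`, and the task result records `"false_condition": "network_state != {}"`. Ansible evaluates `when` conditions in order and reports only the first one that fails, so any further conditions the real task carries are not visible here. A hedged sketch of the pattern, with an invented failure message:

```yaml
# Hypothetical sketch; only the `when` condition "network_state != {}" is
# confirmed by the log. The fail message below is illustrative, not the
# role's actual text.
- name: Abort applying the network state configuration if unsupported
  fail:
    msg: Applying the network state configuration is not supported here
  when: network_state != {}
```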
13531 1726882462.69065: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13531 1726882462.69213: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000016f 13531 1726882462.69222: variable 'ansible_search_path' from source: unknown 13531 1726882462.69225: variable 'ansible_search_path' from source: unknown 13531 1726882462.69270: calling self._execute() 13531 1726882462.69372: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882462.69377: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882462.69390: variable 'omit' from source: magic vars 13531 1726882462.70057: variable 'ansible_distribution_major_version' from source: facts 13531 1726882462.70074: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882462.70187: variable 'network_state' from source: role '' defaults 13531 1726882462.70198: Evaluated conditional (network_state != {}): False 13531 1726882462.70201: when evaluation is False, skipping this task 13531 1726882462.70205: _execute() done 13531 1726882462.70207: dumping result to json 13531 1726882462.70210: done dumping result, returning 13531 1726882462.70216: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-4fd9-519d-00000000016f] 13531 1726882462.70222: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000016f 13531 1726882462.70322: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000016f 13531 1726882462.70325: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13531 1726882462.70378: no more pending results, returning what we have 13531 
1726882462.70382: results queue empty 13531 1726882462.70383: checking for any_errors_fatal 13531 1726882462.70391: done checking for any_errors_fatal 13531 1726882462.70392: checking for max_fail_percentage 13531 1726882462.70394: done checking for max_fail_percentage 13531 1726882462.70395: checking to see if all hosts have failed and the running result is not ok 13531 1726882462.70395: done checking to see if all hosts have failed 13531 1726882462.70396: getting the remaining hosts for this loop 13531 1726882462.70398: done getting the remaining hosts for this loop 13531 1726882462.70401: getting the next task for host managed_node2 13531 1726882462.70407: done getting next task for host managed_node2 13531 1726882462.70411: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13531 1726882462.70416: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13531 1726882462.70440: getting variables 13531 1726882462.70442: in VariableManager get_vars() 13531 1726882462.70492: Calling all_inventory to load vars for managed_node2 13531 1726882462.70495: Calling groups_inventory to load vars for managed_node2 13531 1726882462.70497: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882462.70506: Calling all_plugins_play to load vars for managed_node2 13531 1726882462.70509: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882462.70511: Calling groups_plugins_play to load vars for managed_node2 13531 1726882462.73603: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882462.77467: done with get_vars() 13531 1726882462.77505: done getting variables 13531 1726882462.77571: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:34:22 -0400 (0:00:00.107) 0:00:50.671 ****** 13531 1726882462.77606: entering _queue_task() for managed_node2/fail 13531 1726882462.77950: worker is 1 (out of 1 available) 13531 1726882462.77967: exiting _queue_task() for managed_node2/fail 13531 1726882462.77979: done queuing things up, now waiting for results queue to drain 13531 1726882462.77981: waiting for pending results... 
13531 1726882462.78287: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13531 1726882462.78445: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000170 13531 1726882462.78463: variable 'ansible_search_path' from source: unknown 13531 1726882462.78468: variable 'ansible_search_path' from source: unknown 13531 1726882462.78505: calling self._execute() 13531 1726882462.78607: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882462.78611: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882462.78621: variable 'omit' from source: magic vars 13531 1726882462.79000: variable 'ansible_distribution_major_version' from source: facts 13531 1726882462.79012: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882462.79267: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13531 1726882462.83087: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13531 1726882462.83175: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13531 1726882462.83228: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13531 1726882462.83271: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13531 1726882462.83309: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13531 1726882462.83401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882462.83462: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882462.83500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882462.83559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882462.83581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882462.83697: variable 'ansible_distribution_major_version' from source: facts 13531 1726882462.83719: Evaluated conditional (ansible_distribution_major_version | int > 9): False 13531 1726882462.83727: when evaluation is False, skipping this task 13531 1726882462.83739: _execute() done 13531 1726882462.83751: dumping result to json 13531 1726882462.83758: done dumping result, returning 13531 1726882462.83773: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-4fd9-519d-000000000170] 13531 1726882462.83782: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000170 13531 1726882462.83902: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000170 13531 1726882462.83909: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 13531 1726882462.83972: no more pending results, returning what we have 13531 1726882462.83976: 
results queue empty 13531 1726882462.83977: checking for any_errors_fatal 13531 1726882462.83983: done checking for any_errors_fatal 13531 1726882462.83984: checking for max_fail_percentage 13531 1726882462.83986: done checking for max_fail_percentage 13531 1726882462.83986: checking to see if all hosts have failed and the running result is not ok 13531 1726882462.83987: done checking to see if all hosts have failed 13531 1726882462.83987: getting the remaining hosts for this loop 13531 1726882462.83989: done getting the remaining hosts for this loop 13531 1726882462.83992: getting the next task for host managed_node2 13531 1726882462.83999: done getting next task for host managed_node2 13531 1726882462.84004: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13531 1726882462.84008: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13531 1726882462.84029: getting variables 13531 1726882462.84031: in VariableManager get_vars() 13531 1726882462.84089: Calling all_inventory to load vars for managed_node2 13531 1726882462.84092: Calling groups_inventory to load vars for managed_node2 13531 1726882462.84094: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882462.84105: Calling all_plugins_play to load vars for managed_node2 13531 1726882462.84108: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882462.84111: Calling groups_plugins_play to load vars for managed_node2 13531 1726882462.85486: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882462.86660: done with get_vars() 13531 1726882462.86701: done getting variables 13531 1726882462.86745: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:34:22 -0400 (0:00:00.091) 0:00:50.763 ****** 13531 1726882462.86802: entering _queue_task() for managed_node2/dnf 13531 1726882462.87128: worker is 1 (out of 1 available) 13531 1726882462.87140: exiting _queue_task() for managed_node2/dnf 13531 1726882462.87152: done queuing things up, now waiting for results queue to drain 13531 1726882462.87155: waiting for pending results... 
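The teaming task above was skipped after the executor logged `Evaluated conditional (ansible_distribution_major_version | int > 9): False`; the preceding role-wide check `ansible_distribution_major_version != '6'` had already evaluated True. Note the `| int` cast: distribution facts are strings, so a numeric comparison requires conversion. A hedged sketch of the skipped task, assuming a single guarding condition (the real task may carry more):

```yaml
# Hypothetical sketch; the log confirms only the failing condition
# "ansible_distribution_major_version | int > 9". The message is invented.
- name: Abort applying teaming configuration on EL10 or later
  fail:
    msg: Team interfaces are not supported on this distribution version
  when: ansible_distribution_major_version | int > 9
```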
13531 1726882462.87818: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13531 1726882462.87831: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000171 13531 1726882462.87835: variable 'ansible_search_path' from source: unknown 13531 1726882462.87837: variable 'ansible_search_path' from source: unknown 13531 1726882462.87841: calling self._execute() 13531 1726882462.87843: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882462.87846: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882462.87848: variable 'omit' from source: magic vars 13531 1726882462.88247: variable 'ansible_distribution_major_version' from source: facts 13531 1726882462.88250: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882462.88443: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13531 1726882462.90721: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13531 1726882462.90774: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13531 1726882462.90802: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13531 1726882462.90827: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13531 1726882462.90847: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13531 1726882462.90913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882462.90933: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882462.90950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882462.90985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882462.90997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882462.91088: variable 'ansible_distribution' from source: facts 13531 1726882462.91091: variable 'ansible_distribution_major_version' from source: facts 13531 1726882462.91105: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 13531 1726882462.91191: variable '__network_wireless_connections_defined' from source: role '' defaults 13531 1726882462.91282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882462.91300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882462.91320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882462.91345: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882462.91356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882462.91388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882462.91407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882462.91424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882462.91448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882462.91460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882462.91489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882462.91505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 
1726882462.91526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882462.91552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882462.91566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882462.91692: variable 'network_connections' from source: task vars 13531 1726882462.91701: variable 'controller_profile' from source: play vars 13531 1726882462.91752: variable 'controller_profile' from source: play vars 13531 1726882462.91803: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13531 1726882462.91985: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13531 1726882462.92020: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13531 1726882462.92099: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13531 1726882462.92131: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13531 1726882462.92192: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13531 1726882462.92223: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13531 1726882462.92261: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882462.92311: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13531 1726882462.92369: variable '__network_team_connections_defined' from source: role '' defaults 13531 1726882462.92647: variable 'network_connections' from source: task vars 13531 1726882462.92658: variable 'controller_profile' from source: play vars 13531 1726882462.92749: variable 'controller_profile' from source: play vars 13531 1726882462.92794: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13531 1726882462.92803: when evaluation is False, skipping this task 13531 1726882462.92836: _execute() done 13531 1726882462.92862: dumping result to json 13531 1726882462.92883: done dumping result, returning 13531 1726882462.92911: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-4fd9-519d-000000000171] 13531 1726882462.92929: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000171 13531 1726882462.93080: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000171 13531 1726882462.93083: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13531 1726882462.93130: no more pending results, returning what we have 13531 1726882462.93134: results queue empty 13531 1726882462.93134: checking for any_errors_fatal 13531 1726882462.93141: done checking for 
any_errors_fatal 13531 1726882462.93142: checking for max_fail_percentage 13531 1726882462.93144: done checking for max_fail_percentage 13531 1726882462.93145: checking to see if all hosts have failed and the running result is not ok 13531 1726882462.93146: done checking to see if all hosts have failed 13531 1726882462.93146: getting the remaining hosts for this loop 13531 1726882462.93148: done getting the remaining hosts for this loop 13531 1726882462.93151: getting the next task for host managed_node2 13531 1726882462.93158: done getting next task for host managed_node2 13531 1726882462.93162: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13531 1726882462.93167: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13531 1726882462.93190: getting variables 13531 1726882462.93192: in VariableManager get_vars() 13531 1726882462.93242: Calling all_inventory to load vars for managed_node2 13531 1726882462.93244: Calling groups_inventory to load vars for managed_node2 13531 1726882462.93247: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882462.93257: Calling all_plugins_play to load vars for managed_node2 13531 1726882462.93259: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882462.93262: Calling groups_plugins_play to load vars for managed_node2 13531 1726882462.94268: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882462.95234: done with get_vars() 13531 1726882462.95252: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 13531 1726882462.95313: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:34:22 -0400 (0:00:00.085) 0:00:50.848 ****** 13531 1726882462.95338: entering _queue_task() for managed_node2/yum 13531 1726882462.95586: worker is 1 (out of 1 available) 13531 1726882462.95599: exiting _queue_task() for managed_node2/yum 13531 1726882462.95613: done queuing things up, now waiting for results queue to drain 13531 1726882462.95614: waiting for pending results... 
13531 1726882462.95802: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13531 1726882462.95909: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000172 13531 1726882462.95919: variable 'ansible_search_path' from source: unknown 13531 1726882462.95923: variable 'ansible_search_path' from source: unknown 13531 1726882462.95953: calling self._execute() 13531 1726882462.96029: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882462.96032: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882462.96041: variable 'omit' from source: magic vars 13531 1726882462.96337: variable 'ansible_distribution_major_version' from source: facts 13531 1726882462.96379: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882462.96579: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13531 1726882462.98975: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13531 1726882462.99044: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13531 1726882462.99080: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13531 1726882462.99121: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13531 1726882462.99161: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13531 1726882462.99257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882462.99304: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882462.99340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882462.99398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882462.99426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882462.99537: variable 'ansible_distribution_major_version' from source: facts 13531 1726882462.99552: Evaluated conditional (ansible_distribution_major_version | int < 8): False 13531 1726882462.99555: when evaluation is False, skipping this task 13531 1726882462.99558: _execute() done 13531 1726882462.99562: dumping result to json 13531 1726882462.99566: done dumping result, returning 13531 1726882462.99575: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-4fd9-519d-000000000172] 13531 1726882462.99580: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000172 13531 1726882462.99721: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000172 13531 1726882462.99724: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 13531 1726882462.99780: no more pending results, returning 
what we have 13531 1726882462.99784: results queue empty 13531 1726882462.99785: checking for any_errors_fatal 13531 1726882462.99792: done checking for any_errors_fatal 13531 1726882462.99793: checking for max_fail_percentage 13531 1726882462.99795: done checking for max_fail_percentage 13531 1726882462.99796: checking to see if all hosts have failed and the running result is not ok 13531 1726882462.99796: done checking to see if all hosts have failed 13531 1726882462.99797: getting the remaining hosts for this loop 13531 1726882462.99799: done getting the remaining hosts for this loop 13531 1726882462.99802: getting the next task for host managed_node2 13531 1726882462.99809: done getting next task for host managed_node2 13531 1726882462.99813: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13531 1726882462.99816: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13531 1726882462.99856: getting variables 13531 1726882462.99858: in VariableManager get_vars() 13531 1726882462.99965: Calling all_inventory to load vars for managed_node2 13531 1726882462.99971: Calling groups_inventory to load vars for managed_node2 13531 1726882462.99974: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882462.99986: Calling all_plugins_play to load vars for managed_node2 13531 1726882462.99993: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882462.99997: Calling groups_plugins_play to load vars for managed_node2 13531 1726882463.01300: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882463.02578: done with get_vars() 13531 1726882463.02596: done getting variables 13531 1726882463.02643: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:34:23 -0400 (0:00:00.073) 0:00:50.921 ****** 13531 1726882463.02675: entering _queue_task() for managed_node2/fail 13531 1726882463.02926: worker is 1 (out of 1 available) 13531 1726882463.02939: exiting _queue_task() for managed_node2/fail 13531 1726882463.02952: done queuing things up, now waiting for results queue to drain 13531 1726882463.02956: waiting for pending results... 
13531 1726882463.03149: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13531 1726882463.03250: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000173 13531 1726882463.03266: variable 'ansible_search_path' from source: unknown 13531 1726882463.03270: variable 'ansible_search_path' from source: unknown 13531 1726882463.03302: calling self._execute() 13531 1726882463.03379: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882463.03384: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882463.03393: variable 'omit' from source: magic vars 13531 1726882463.03677: variable 'ansible_distribution_major_version' from source: facts 13531 1726882463.03687: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882463.03774: variable '__network_wireless_connections_defined' from source: role '' defaults 13531 1726882463.03916: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13531 1726882463.05577: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13531 1726882463.05622: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13531 1726882463.05650: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13531 1726882463.05744: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13531 1726882463.05747: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13531 1726882463.05830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 13531 1726882463.05878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882463.05905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882463.05950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882463.05966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882463.06036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882463.06068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882463.06094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882463.06122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882463.06140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882463.06201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882463.06231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882463.06270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882463.06295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882463.06306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882463.06480: variable 'network_connections' from source: task vars 13531 1726882463.06497: variable 'controller_profile' from source: play vars 13531 1726882463.06562: variable 'controller_profile' from source: play vars 13531 1726882463.06625: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13531 1726882463.06736: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13531 1726882463.06768: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13531 1726882463.06792: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13531 
1726882463.06814: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13531 1726882463.06869: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13531 1726882463.06892: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13531 1726882463.06936: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882463.06949: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13531 1726882463.07023: variable '__network_team_connections_defined' from source: role '' defaults 13531 1726882463.07209: variable 'network_connections' from source: task vars 13531 1726882463.07214: variable 'controller_profile' from source: play vars 13531 1726882463.07257: variable 'controller_profile' from source: play vars 13531 1726882463.07281: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13531 1726882463.07284: when evaluation is False, skipping this task 13531 1726882463.07287: _execute() done 13531 1726882463.07289: dumping result to json 13531 1726882463.07292: done dumping result, returning 13531 1726882463.07300: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-4fd9-519d-000000000173] 13531 1726882463.07307: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000173 13531 1726882463.07403: 
done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000173 13531 1726882463.07405: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13531 1726882463.07460: no more pending results, returning what we have 13531 1726882463.07467: results queue empty 13531 1726882463.07468: checking for any_errors_fatal 13531 1726882463.07474: done checking for any_errors_fatal 13531 1726882463.07475: checking for max_fail_percentage 13531 1726882463.07477: done checking for max_fail_percentage 13531 1726882463.07477: checking to see if all hosts have failed and the running result is not ok 13531 1726882463.07478: done checking to see if all hosts have failed 13531 1726882463.07479: getting the remaining hosts for this loop 13531 1726882463.07480: done getting the remaining hosts for this loop 13531 1726882463.07487: getting the next task for host managed_node2 13531 1726882463.07498: done getting next task for host managed_node2 13531 1726882463.07502: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 13531 1726882463.07506: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13531 1726882463.07527: getting variables 13531 1726882463.07529: in VariableManager get_vars() 13531 1726882463.07583: Calling all_inventory to load vars for managed_node2 13531 1726882463.07586: Calling groups_inventory to load vars for managed_node2 13531 1726882463.07589: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882463.07606: Calling all_plugins_play to load vars for managed_node2 13531 1726882463.07609: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882463.07612: Calling groups_plugins_play to load vars for managed_node2 13531 1726882463.08676: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882463.09873: done with get_vars() 13531 1726882463.09898: done getting variables 13531 1726882463.09944: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:34:23 -0400 (0:00:00.072) 0:00:50.994 ****** 13531 1726882463.09976: entering _queue_task() for managed_node2/package 13531 1726882463.10230: worker is 1 (out of 1 available) 13531 1726882463.10243: exiting _queue_task() for managed_node2/package 13531 1726882463.10259: done queuing things up, now waiting for results queue to drain 13531 1726882463.10260: waiting for pending results... 
13531 1726882463.10452: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 13531 1726882463.10583: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000174 13531 1726882463.10594: variable 'ansible_search_path' from source: unknown 13531 1726882463.10598: variable 'ansible_search_path' from source: unknown 13531 1726882463.10633: calling self._execute() 13531 1726882463.10714: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882463.10723: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882463.10729: variable 'omit' from source: magic vars 13531 1726882463.11011: variable 'ansible_distribution_major_version' from source: facts 13531 1726882463.11021: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882463.11170: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13531 1726882463.11413: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13531 1726882463.11464: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13531 1726882463.11494: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13531 1726882463.11577: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13531 1726882463.11689: variable 'network_packages' from source: role '' defaults 13531 1726882463.11793: variable '__network_provider_setup' from source: role '' defaults 13531 1726882463.11820: variable '__network_service_name_default_nm' from source: role '' defaults 13531 1726882463.11873: variable '__network_service_name_default_nm' from source: role '' defaults 13531 1726882463.11881: variable '__network_packages_default_nm' from source: role '' defaults 13531 1726882463.11924: variable 
'__network_packages_default_nm' from source: role '' defaults 13531 1726882463.12053: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13531 1726882463.13837: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13531 1726882463.13890: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13531 1726882463.13930: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13531 1726882463.13962: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13531 1726882463.13987: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13531 1726882463.14058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882463.14082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882463.14102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882463.14138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882463.14148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 
1726882463.14206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882463.14237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882463.14254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882463.14296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882463.14309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882463.14480: variable '__network_packages_default_gobject_packages' from source: role '' defaults 13531 1726882463.14551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882463.14571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882463.14590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882463.14616: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882463.14626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882463.14701: variable 'ansible_python' from source: facts 13531 1726882463.14721: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 13531 1726882463.14781: variable '__network_wpa_supplicant_required' from source: role '' defaults 13531 1726882463.14837: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13531 1726882463.14923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882463.14940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882463.14958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882463.15004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882463.15013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882463.15055: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882463.15090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882463.15108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882463.15135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882463.15146: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882463.15281: variable 'network_connections' from source: task vars 13531 1726882463.15284: variable 'controller_profile' from source: play vars 13531 1726882463.15365: variable 'controller_profile' from source: play vars 13531 1726882463.15447: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13531 1726882463.15493: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13531 1726882463.15517: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882463.15542: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13531 1726882463.15610: variable '__network_wireless_connections_defined' from source: role '' defaults 13531 1726882463.15811: variable 'network_connections' from source: task vars 13531 1726882463.15814: variable 'controller_profile' from source: play vars 13531 1726882463.15904: variable 'controller_profile' from source: play vars 13531 1726882463.15927: variable '__network_packages_default_wireless' from source: role '' defaults 13531 1726882463.16010: variable '__network_wireless_connections_defined' from source: role '' defaults 13531 1726882463.16296: variable 'network_connections' from source: task vars 13531 1726882463.16299: variable 'controller_profile' from source: play vars 13531 1726882463.16346: variable 'controller_profile' from source: play vars 13531 1726882463.16366: variable '__network_packages_default_team' from source: role '' defaults 13531 1726882463.16421: variable '__network_team_connections_defined' from source: role '' defaults 13531 1726882463.16694: variable 'network_connections' from source: task vars 13531 1726882463.16698: variable 'controller_profile' from source: play vars 13531 1726882463.16754: variable 'controller_profile' from source: play vars 13531 1726882463.16824: variable '__network_service_name_default_initscripts' from source: role '' defaults 13531 1726882463.16870: variable '__network_service_name_default_initscripts' from source: role '' defaults 13531 1726882463.16874: variable '__network_packages_default_initscripts' from source: role '' defaults 13531 1726882463.16920: variable '__network_packages_default_initscripts' from source: role '' defaults 13531 1726882463.17081: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 13531 1726882463.17527: variable 'network_connections' from source: task vars 13531 
1726882463.17531: variable 'controller_profile' from source: play vars 13531 1726882463.17581: variable 'controller_profile' from source: play vars 13531 1726882463.17585: variable 'ansible_distribution' from source: facts 13531 1726882463.17589: variable '__network_rh_distros' from source: role '' defaults 13531 1726882463.17601: variable 'ansible_distribution_major_version' from source: facts 13531 1726882463.17612: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 13531 1726882463.17731: variable 'ansible_distribution' from source: facts 13531 1726882463.17735: variable '__network_rh_distros' from source: role '' defaults 13531 1726882463.17738: variable 'ansible_distribution_major_version' from source: facts 13531 1726882463.17750: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 13531 1726882463.17867: variable 'ansible_distribution' from source: facts 13531 1726882463.17873: variable '__network_rh_distros' from source: role '' defaults 13531 1726882463.17875: variable 'ansible_distribution_major_version' from source: facts 13531 1726882463.17915: variable 'network_provider' from source: set_fact 13531 1726882463.17927: variable 'ansible_facts' from source: unknown 13531 1726882463.18374: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 13531 1726882463.18377: when evaluation is False, skipping this task 13531 1726882463.18380: _execute() done 13531 1726882463.18382: dumping result to json 13531 1726882463.18384: done dumping result, returning 13531 1726882463.18392: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-4fd9-519d-000000000174] 13531 1726882463.18398: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000174 13531 1726882463.18504: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000174 13531 1726882463.18510: WORKER PROCESS 
EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "not network_packages is subset(ansible_facts.packages.keys())",
    "skip_reason": "Conditional result was False"
}
13531 1726882463.18638: no more pending results, returning what we have
13531 1726882463.18657: results queue empty
13531 1726882463.18658: checking for any_errors_fatal
13531 1726882463.18686: done checking for any_errors_fatal
13531 1726882463.18693: checking for max_fail_percentage
13531 1726882463.18700: done checking for max_fail_percentage
13531 1726882463.18702: checking to see if all hosts have failed and the running result is not ok
13531 1726882463.18703: done checking to see if all hosts have failed
13531 1726882463.18703: getting the remaining hosts for this loop
13531 1726882463.18706: done getting the remaining hosts for this loop
13531 1726882463.18720: getting the next task for host managed_node2
13531 1726882463.18733: done getting next task for host managed_node2
13531 1726882463.18747: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
13531 1726882463.18766: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue?
False, did start at task? False
13531 1726882463.18821: getting variables
13531 1726882463.18824: in VariableManager get_vars()
13531 1726882463.19144: Calling all_inventory to load vars for managed_node2
13531 1726882463.19176: Calling groups_inventory to load vars for managed_node2
13531 1726882463.19248: Calling all_plugins_inventory to load vars for managed_node2
13531 1726882463.19298: Calling all_plugins_play to load vars for managed_node2
13531 1726882463.19301: Calling groups_plugins_inventory to load vars for managed_node2
13531 1726882463.19305: Calling groups_plugins_play to load vars for managed_node2
13531 1726882463.23358: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13531 1726882463.24414: done with get_vars()
13531 1726882463.24436: done getting variables
13531 1726882463.24489: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Friday 20 September 2024  21:34:23 -0400 (0:00:00.145)       0:00:51.140 ******
13531 1726882463.24515: entering _queue_task() for managed_node2/package
13531 1726882463.24763: worker is 1 (out of 1 available)
13531 1726882463.24780: exiting _queue_task() for managed_node2/package
13531 1726882463.24791: done queuing things up, now waiting for results queue to drain
13531 1726882463.24793: waiting for pending results...
13531 1726882463.24985: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
13531 1726882463.25097: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000175
13531 1726882463.25108: variable 'ansible_search_path' from source: unknown
13531 1726882463.25111: variable 'ansible_search_path' from source: unknown
13531 1726882463.25149: calling self._execute()
13531 1726882463.25226: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882463.25231: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882463.25239: variable 'omit' from source: magic vars
13531 1726882463.25518: variable 'ansible_distribution_major_version' from source: facts
13531 1726882463.25528: Evaluated conditional (ansible_distribution_major_version != '6'): True
13531 1726882463.25615: variable 'network_state' from source: role '' defaults
13531 1726882463.25624: Evaluated conditional (network_state != {}): False
13531 1726882463.25627: when evaluation is False, skipping this task
13531 1726882463.25629: _execute() done
13531 1726882463.25632: dumping result to json
13531 1726882463.25634: done dumping result, returning
13531 1726882463.25642: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-4fd9-519d-000000000175]
13531 1726882463.25648: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000175
13531 1726882463.25741: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000175
13531 1726882463.25744: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
13531 1726882463.25798: no more pending results, returning what we have
13531 1726882463.25802: results queue empty
13531 1726882463.25803: checking
for any_errors_fatal
13531 1726882463.25811: done checking for any_errors_fatal
13531 1726882463.25812: checking for max_fail_percentage
13531 1726882463.25813: done checking for max_fail_percentage
13531 1726882463.25815: checking to see if all hosts have failed and the running result is not ok
13531 1726882463.25816: done checking to see if all hosts have failed
13531 1726882463.25816: getting the remaining hosts for this loop
13531 1726882463.25818: done getting the remaining hosts for this loop
13531 1726882463.25820: getting the next task for host managed_node2
13531 1726882463.25827: done getting next task for host managed_node2
13531 1726882463.25831: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
13531 1726882463.25835: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task?
False
13531 1726882463.25857: getting variables
13531 1726882463.25858: in VariableManager get_vars()
13531 1726882463.25912: Calling all_inventory to load vars for managed_node2
13531 1726882463.25915: Calling groups_inventory to load vars for managed_node2
13531 1726882463.25917: Calling all_plugins_inventory to load vars for managed_node2
13531 1726882463.25926: Calling all_plugins_play to load vars for managed_node2
13531 1726882463.25929: Calling groups_plugins_inventory to load vars for managed_node2
13531 1726882463.25931: Calling groups_plugins_play to load vars for managed_node2
13531 1726882463.27095: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13531 1726882463.28947: done with get_vars()
13531 1726882463.28976: done getting variables
13531 1726882463.29038: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Friday 20 September 2024  21:34:23 -0400 (0:00:00.045)       0:00:51.185 ******
13531 1726882463.29079: entering _queue_task() for managed_node2/package
13531 1726882463.29408: worker is 1 (out of 1 available)
13531 1726882463.29419: exiting _queue_task() for managed_node2/package
13531 1726882463.29434: done queuing things up, now waiting for results queue to drain
13531 1726882463.29435: waiting for pending results...
13531 1726882463.29728: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
13531 1726882463.29889: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000176
13531 1726882463.29911: variable 'ansible_search_path' from source: unknown
13531 1726882463.29919: variable 'ansible_search_path' from source: unknown
13531 1726882463.29966: calling self._execute()
13531 1726882463.30069: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882463.30082: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882463.30100: variable 'omit' from source: magic vars
13531 1726882463.30492: variable 'ansible_distribution_major_version' from source: facts
13531 1726882463.30511: Evaluated conditional (ansible_distribution_major_version != '6'): True
13531 1726882463.30640: variable 'network_state' from source: role '' defaults
13531 1726882463.30658: Evaluated conditional (network_state != {}): False
13531 1726882463.30669: when evaluation is False, skipping this task
13531 1726882463.30677: _execute() done
13531 1726882463.30684: dumping result to json
13531 1726882463.30690: done dumping result, returning
13531 1726882463.30702: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-4fd9-519d-000000000176]
13531 1726882463.30714: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000176
13531 1726882463.30833: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000176
13531 1726882463.30841: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
13531 1726882463.30905: no more pending results, returning what we have
13531 1726882463.30909: results queue empty
13531 1726882463.30910: checking for
any_errors_fatal
13531 1726882463.30917: done checking for any_errors_fatal
13531 1726882463.30918: checking for max_fail_percentage
13531 1726882463.30920: done checking for max_fail_percentage
13531 1726882463.30921: checking to see if all hosts have failed and the running result is not ok
13531 1726882463.30922: done checking to see if all hosts have failed
13531 1726882463.30923: getting the remaining hosts for this loop
13531 1726882463.30925: done getting the remaining hosts for this loop
13531 1726882463.30928: getting the next task for host managed_node2
13531 1726882463.30936: done getting next task for host managed_node2
13531 1726882463.30941: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
13531 1726882463.30946: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task?
False
13531 1726882463.30975: getting variables
13531 1726882463.30977: in VariableManager get_vars()
13531 1726882463.31033: Calling all_inventory to load vars for managed_node2
13531 1726882463.31036: Calling groups_inventory to load vars for managed_node2
13531 1726882463.31039: Calling all_plugins_inventory to load vars for managed_node2
13531 1726882463.31052: Calling all_plugins_play to load vars for managed_node2
13531 1726882463.31058: Calling groups_plugins_inventory to load vars for managed_node2
13531 1726882463.31061: Calling groups_plugins_play to load vars for managed_node2
13531 1726882463.37980: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13531 1726882463.39650: done with get_vars()
13531 1726882463.39678: done getting variables
13531 1726882463.39723: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Friday 20 September 2024  21:34:23 -0400 (0:00:00.106)       0:00:51.292 ******
13531 1726882463.39757: entering _queue_task() for managed_node2/service
13531 1726882463.40092: worker is 1 (out of 1 available)
13531 1726882463.40105: exiting _queue_task() for managed_node2/service
13531 1726882463.40117: done queuing things up, now waiting for results queue to drain
13531 1726882463.40119: waiting for pending results...
13531 1726882463.40421: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 13531 1726882463.40589: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000177 13531 1726882463.40610: variable 'ansible_search_path' from source: unknown 13531 1726882463.40617: variable 'ansible_search_path' from source: unknown 13531 1726882463.40662: calling self._execute() 13531 1726882463.40762: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882463.40781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882463.40796: variable 'omit' from source: magic vars 13531 1726882463.41183: variable 'ansible_distribution_major_version' from source: facts 13531 1726882463.41204: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882463.41433: variable '__network_wireless_connections_defined' from source: role '' defaults 13531 1726882463.41637: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13531 1726882463.44298: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13531 1726882463.44378: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13531 1726882463.44413: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13531 1726882463.44453: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13531 1726882463.44479: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13531 1726882463.44561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 13531 1726882463.44591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882463.44616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882463.44665: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882463.44685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882463.44730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882463.44760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882463.44784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882463.44822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882463.44835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882463.44881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882463.44902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882463.46161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882463.46205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882463.46224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882463.46420: variable 'network_connections' from source: task vars 13531 1726882463.46439: variable 'controller_profile' from source: play vars 13531 1726882463.46509: variable 'controller_profile' from source: play vars 13531 1726882463.46590: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13531 1726882463.47277: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13531 1726882463.47434: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13531 1726882463.47466: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13531 
1726882463.47494: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13531 1726882463.47657: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13531 1726882463.47677: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13531 1726882463.47703: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882463.47724: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13531 1726882463.47891: variable '__network_team_connections_defined' from source: role '' defaults 13531 1726882463.48495: variable 'network_connections' from source: task vars 13531 1726882463.48506: variable 'controller_profile' from source: play vars 13531 1726882463.48571: variable 'controller_profile' from source: play vars 13531 1726882463.48707: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13531 1726882463.48711: when evaluation is False, skipping this task 13531 1726882463.48719: _execute() done 13531 1726882463.48722: dumping result to json 13531 1726882463.48724: done dumping result, returning 13531 1726882463.48733: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-4fd9-519d-000000000177] 13531 1726882463.48739: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000177 13531 1726882463.48845: done sending task 
result for task 0e448fcc-3ce9-4fd9-519d-000000000177 13531 1726882463.48858: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13531 1726882463.48912: no more pending results, returning what we have 13531 1726882463.48916: results queue empty 13531 1726882463.48917: checking for any_errors_fatal 13531 1726882463.48924: done checking for any_errors_fatal 13531 1726882463.48925: checking for max_fail_percentage 13531 1726882463.48927: done checking for max_fail_percentage 13531 1726882463.48928: checking to see if all hosts have failed and the running result is not ok 13531 1726882463.48928: done checking to see if all hosts have failed 13531 1726882463.48929: getting the remaining hosts for this loop 13531 1726882463.48930: done getting the remaining hosts for this loop 13531 1726882463.48933: getting the next task for host managed_node2 13531 1726882463.48940: done getting next task for host managed_node2 13531 1726882463.48944: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13531 1726882463.48947: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13531 1726882463.48972: getting variables 13531 1726882463.48974: in VariableManager get_vars() 13531 1726882463.49028: Calling all_inventory to load vars for managed_node2 13531 1726882463.49030: Calling groups_inventory to load vars for managed_node2 13531 1726882463.49033: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882463.49049: Calling all_plugins_play to load vars for managed_node2 13531 1726882463.49053: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882463.49058: Calling groups_plugins_play to load vars for managed_node2 13531 1726882463.51249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882463.53303: done with get_vars() 13531 1726882463.53334: done getting variables 13531 1726882463.53404: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:34:23 -0400 (0:00:00.136) 0:00:51.429 ****** 13531 1726882463.53438: entering _queue_task() for managed_node2/service 13531 1726882463.53783: worker is 1 (out of 1 available) 13531 1726882463.53799: exiting _queue_task() for managed_node2/service 13531 1726882463.53815: done queuing things up, now waiting for results queue to drain 13531 1726882463.53816: waiting for pending results... 
13531 1726882463.54161: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13531 1726882463.54316: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000178 13531 1726882463.54330: variable 'ansible_search_path' from source: unknown 13531 1726882463.54334: variable 'ansible_search_path' from source: unknown 13531 1726882463.54377: calling self._execute() 13531 1726882463.54491: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882463.54502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882463.54512: variable 'omit' from source: magic vars 13531 1726882463.54965: variable 'ansible_distribution_major_version' from source: facts 13531 1726882463.54978: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882463.55189: variable 'network_provider' from source: set_fact 13531 1726882463.55192: variable 'network_state' from source: role '' defaults 13531 1726882463.55204: Evaluated conditional (network_provider == "nm" or network_state != {}): True 13531 1726882463.55211: variable 'omit' from source: magic vars 13531 1726882463.55286: variable 'omit' from source: magic vars 13531 1726882463.55314: variable 'network_service_name' from source: role '' defaults 13531 1726882463.55389: variable 'network_service_name' from source: role '' defaults 13531 1726882463.55509: variable '__network_provider_setup' from source: role '' defaults 13531 1726882463.55513: variable '__network_service_name_default_nm' from source: role '' defaults 13531 1726882463.55582: variable '__network_service_name_default_nm' from source: role '' defaults 13531 1726882463.55594: variable '__network_packages_default_nm' from source: role '' defaults 13531 1726882463.55660: variable '__network_packages_default_nm' from source: role '' defaults 13531 1726882463.55910: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 13531 1726882463.58724: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13531 1726882463.58801: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13531 1726882463.58949: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13531 1726882463.58986: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13531 1726882463.59012: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13531 1726882463.59208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882463.59236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882463.59386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882463.59424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882463.59440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882463.59602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 13531 1726882463.59624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882463.59649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882463.60268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882463.60272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882463.60389: variable '__network_packages_default_gobject_packages' from source: role '' defaults 13531 1726882463.60567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882463.60625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882463.60645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882463.60691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882463.60703: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882463.60815: variable 'ansible_python' from source: facts 13531 1726882463.60832: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 13531 1726882463.60921: variable '__network_wpa_supplicant_required' from source: role '' defaults 13531 1726882463.61012: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13531 1726882463.61150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882463.61174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882463.61203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882463.61249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882463.61266: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882463.61318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882463.61346: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882463.61371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882463.61409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882463.61431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882463.61584: variable 'network_connections' from source: task vars 13531 1726882463.61592: variable 'controller_profile' from source: play vars 13531 1726882463.61677: variable 'controller_profile' from source: play vars 13531 1726882463.61793: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13531 1726882463.62014: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13531 1726882463.62066: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13531 1726882463.62117: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13531 1726882463.62159: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13531 1726882463.62228: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13531 1726882463.62260: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13531 1726882463.62299: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882463.62335: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13531 1726882463.62385: variable '__network_wireless_connections_defined' from source: role '' defaults 13531 1726882463.62744: variable 'network_connections' from source: task vars 13531 1726882463.62748: variable 'controller_profile' from source: play vars 13531 1726882463.62834: variable 'controller_profile' from source: play vars 13531 1726882463.63131: variable '__network_packages_default_wireless' from source: role '' defaults 13531 1726882463.63135: variable '__network_wireless_connections_defined' from source: role '' defaults 13531 1726882463.63408: variable 'network_connections' from source: task vars 13531 1726882463.63412: variable 'controller_profile' from source: play vars 13531 1726882463.63414: variable 'controller_profile' from source: play vars 13531 1726882463.63416: variable '__network_packages_default_team' from source: role '' defaults 13531 1726882463.63972: variable '__network_team_connections_defined' from source: role '' defaults 13531 1726882463.63975: variable 'network_connections' from source: task vars 13531 1726882463.63978: variable 'controller_profile' from source: play vars 13531 1726882463.63980: variable 'controller_profile' from source: play vars 13531 1726882463.63984: variable '__network_service_name_default_initscripts' from source: role '' defaults 13531 1726882463.64002: variable '__network_service_name_default_initscripts' from 
source: role '' defaults 13531 1726882463.64009: variable '__network_packages_default_initscripts' from source: role '' defaults 13531 1726882463.64097: variable '__network_packages_default_initscripts' from source: role '' defaults 13531 1726882463.64321: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 13531 1726882463.65059: variable 'network_connections' from source: task vars 13531 1726882463.65062: variable 'controller_profile' from source: play vars 13531 1726882463.65253: variable 'controller_profile' from source: play vars 13531 1726882463.65259: variable 'ansible_distribution' from source: facts 13531 1726882463.65266: variable '__network_rh_distros' from source: role '' defaults 13531 1726882463.65272: variable 'ansible_distribution_major_version' from source: facts 13531 1726882463.65286: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 13531 1726882463.65693: variable 'ansible_distribution' from source: facts 13531 1726882463.65697: variable '__network_rh_distros' from source: role '' defaults 13531 1726882463.65703: variable 'ansible_distribution_major_version' from source: facts 13531 1726882463.65719: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 13531 1726882463.66159: variable 'ansible_distribution' from source: facts 13531 1726882463.66162: variable '__network_rh_distros' from source: role '' defaults 13531 1726882463.66167: variable 'ansible_distribution_major_version' from source: facts 13531 1726882463.66205: variable 'network_provider' from source: set_fact 13531 1726882463.66292: variable 'omit' from source: magic vars 13531 1726882463.66318: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882463.66395: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882463.66414: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882463.66547: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882463.66565: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882463.66631: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882463.66634: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882463.66637: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882463.66842: Set connection var ansible_pipelining to False 13531 1726882463.66846: Set connection var ansible_timeout to 10 13531 1726882463.66851: Set connection var ansible_shell_executable to /bin/sh 13531 1726882463.66858: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882463.66861: Set connection var ansible_connection to ssh 13531 1726882463.66865: Set connection var ansible_shell_type to sh 13531 1726882463.66900: variable 'ansible_shell_executable' from source: unknown 13531 1726882463.66903: variable 'ansible_connection' from source: unknown 13531 1726882463.66906: variable 'ansible_module_compression' from source: unknown 13531 1726882463.66908: variable 'ansible_shell_type' from source: unknown 13531 1726882463.66911: variable 'ansible_shell_executable' from source: unknown 13531 1726882463.66913: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882463.66917: variable 'ansible_pipelining' from source: unknown 13531 1726882463.66919: variable 'ansible_timeout' from source: unknown 13531 1726882463.66923: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882463.67045: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882463.67058: variable 'omit' from source: magic vars 13531 1726882463.67063: starting attempt loop 13531 1726882463.67074: running the handler 13531 1726882463.67159: variable 'ansible_facts' from source: unknown 13531 1726882463.67937: _low_level_execute_command(): starting 13531 1726882463.67942: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13531 1726882463.68445: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882463.68465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882463.68479: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882463.68490: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882463.68538: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882463.68550: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882463.68672: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 13531 1726882463.70326: stdout chunk (state=3): >>>/root <<< 13531 1726882463.70512: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882463.70515: stdout chunk (state=3): >>><<< 13531 1726882463.70518: stderr chunk (state=3): >>><<< 13531 1726882463.70641: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882463.70645: _low_level_execute_command(): starting 13531 1726882463.70647: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882463.7054508-15794-12606855483983 `" && echo ansible-tmp-1726882463.7054508-15794-12606855483983="` echo /root/.ansible/tmp/ansible-tmp-1726882463.7054508-15794-12606855483983 `" ) && sleep 0' 13531 
1726882463.71229: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882463.71242: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882463.71260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882463.71280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882463.71330: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882463.71368: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882463.71371: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 13531 1726882463.71374: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882463.71376: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882463.71430: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882463.71447: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882463.71574: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882463.73456: stdout chunk (state=3): >>>ansible-tmp-1726882463.7054508-15794-12606855483983=/root/.ansible/tmp/ansible-tmp-1726882463.7054508-15794-12606855483983 <<< 13531 1726882463.73562: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 <<< 13531 1726882463.73620: stderr chunk (state=3): >>><<< 13531 1726882463.73623: stdout chunk (state=3): >>><<< 13531 1726882463.73639: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882463.7054508-15794-12606855483983=/root/.ansible/tmp/ansible-tmp-1726882463.7054508-15794-12606855483983 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882463.73675: variable 'ansible_module_compression' from source: unknown 13531 1726882463.73717: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13531i_snf1g8/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 13531 1726882463.73770: variable 'ansible_facts' from source: unknown 13531 1726882463.73906: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882463.7054508-15794-12606855483983/AnsiballZ_systemd.py 13531 1726882463.74024: Sending initial data 13531 1726882463.74028: 
Sent initial data (155 bytes) 13531 1726882463.74887: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882463.74896: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882463.75023: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882463.75121: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882463.76847: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13531 1726882463.76946: 
stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13531 1726882463.77046: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13531i_snf1g8/tmptkdlwsy4 /root/.ansible/tmp/ansible-tmp-1726882463.7054508-15794-12606855483983/AnsiballZ_systemd.py <<< 13531 1726882463.77145: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13531 1726882463.79470: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882463.79473: stdout chunk (state=3): >>><<< 13531 1726882463.79477: stderr chunk (state=3): >>><<< 13531 1726882463.79479: done transferring module to remote 13531 1726882463.79481: _low_level_execute_command(): starting 13531 1726882463.79483: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882463.7054508-15794-12606855483983/ /root/.ansible/tmp/ansible-tmp-1726882463.7054508-15794-12606855483983/AnsiballZ_systemd.py && sleep 0' 13531 1726882463.80140: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882463.80162: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882463.80178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882463.80200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882463.80243: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882463.80265: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882463.80281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882463.80303: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882463.80314: 
stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882463.80323: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882463.80337: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882463.80351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882463.80377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882463.80392: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882463.80402: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882463.80419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882463.80507: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882463.80531: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882463.80548: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882463.80680: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882463.82460: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882463.82503: stderr chunk (state=3): >>><<< 13531 1726882463.82506: stdout chunk (state=3): >>><<< 13531 1726882463.82520: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882463.82523: _low_level_execute_command(): starting 13531 1726882463.82527: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882463.7054508-15794-12606855483983/AnsiballZ_systemd.py && sleep 0' 13531 1726882463.82958: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882463.82961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882463.83004: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882463.83009: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882463.83011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882463.83051: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882463.83055: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882463.83178: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882464.08255: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; 
status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "9228288", "MemoryAvailable": "infinity", "CPUUsageNSec": "1253471000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", 
"LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft"<<< 13531 1726882464.08267: stdout chunk (state=3): >>>: "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": 
"yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 
EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 13531 1726882464.09769: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 13531 1726882464.09829: stderr chunk (state=3): >>><<< 13531 1726882464.09832: stdout chunk (state=3): >>><<< 13531 1726882464.09849: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call 
org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "9228288", "MemoryAvailable": "infinity", "CPUUsageNSec": "1253471000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", 
"LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", 
"MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", 
"RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 13531 1726882464.09962: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882463.7054508-15794-12606855483983/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13531 1726882464.09980: _low_level_execute_command(): starting 13531 1726882464.09983: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882463.7054508-15794-12606855483983/ > /dev/null 2>&1 && sleep 0' 13531 1726882464.10454: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882464.10465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882464.10487: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 13531 1726882464.10499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 13531 1726882464.10508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882464.10555: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882464.10574: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882464.10686: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882464.12495: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882464.12547: stderr chunk (state=3): >>><<< 13531 1726882464.12551: stdout chunk (state=3): >>><<< 13531 1726882464.12566: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882464.12572: handler run complete 13531 1726882464.12614: attempt loop complete, returning result 13531 1726882464.12617: _execute() done 13531 1726882464.12620: dumping result to json 13531 1726882464.12630: done dumping result, returning 13531 1726882464.12639: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-4fd9-519d-000000000178] 13531 1726882464.12645: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000178 13531 1726882464.12869: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000178 13531 1726882464.12872: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13531 1726882464.12930: no more pending results, returning what we have 13531 1726882464.12933: results queue empty 13531 1726882464.12934: checking for any_errors_fatal 13531 1726882464.12940: done checking for any_errors_fatal 13531 1726882464.12940: checking for max_fail_percentage 13531 1726882464.12942: done checking for max_fail_percentage 13531 1726882464.12943: checking to see if all hosts have failed and the running result is not ok 13531 1726882464.12944: done checking to see if all hosts have failed 13531 1726882464.12944: getting the remaining hosts for this loop 13531 1726882464.12946: done getting the remaining hosts for this loop 13531 1726882464.12949: getting the next task for host managed_node2 13531 1726882464.12957: done getting next task for host managed_node2 13531 1726882464.12961: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start 
wpa_supplicant 13531 1726882464.12967: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13531 1726882464.12979: getting variables 13531 1726882464.12982: in VariableManager get_vars() 13531 1726882464.13025: Calling all_inventory to load vars for managed_node2 13531 1726882464.13027: Calling groups_inventory to load vars for managed_node2 13531 1726882464.13030: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882464.13039: Calling all_plugins_play to load vars for managed_node2 13531 1726882464.13042: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882464.13044: Calling groups_plugins_play to load vars for managed_node2 13531 1726882464.13884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882464.14941: done with get_vars() 13531 1726882464.14960: done getting variables 13531 1726882464.15004: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:34:24 -0400 (0:00:00.615) 0:00:52.045 ****** 13531 1726882464.15029: entering _queue_task() for managed_node2/service 13531 1726882464.15250: worker is 1 (out of 1 available) 13531 1726882464.15267: exiting _queue_task() for managed_node2/service 13531 1726882464.15279: done queuing things up, now waiting for results queue to drain 13531 1726882464.15281: waiting for pending results... 13531 1726882464.15459: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13531 1726882464.15559: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000179 13531 1726882464.15574: variable 'ansible_search_path' from source: unknown 13531 1726882464.15578: variable 'ansible_search_path' from source: unknown 13531 1726882464.15608: calling self._execute() 13531 1726882464.15687: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882464.15692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882464.15702: variable 'omit' from source: magic vars 13531 1726882464.15984: variable 'ansible_distribution_major_version' from source: facts 13531 1726882464.15994: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882464.16078: variable 'network_provider' from source: set_fact 13531 1726882464.16082: Evaluated conditional (network_provider == "nm"): True 13531 1726882464.16145: variable '__network_wpa_supplicant_required' from source: role '' defaults 13531 1726882464.16209: variable '__network_ieee802_1x_connections_defined' from source: role '' 
defaults 13531 1726882464.16331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13531 1726882464.17840: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13531 1726882464.17887: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13531 1726882464.17914: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13531 1726882464.17940: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13531 1726882464.17960: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13531 1726882464.18028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882464.18047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882464.18068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882464.18097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882464.18107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882464.18139: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882464.18158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882464.18175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882464.18201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882464.18212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882464.18241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882464.18259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882464.18277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882464.18300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 
1726882464.18312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882464.18406: variable 'network_connections' from source: task vars 13531 1726882464.18415: variable 'controller_profile' from source: play vars 13531 1726882464.18462: variable 'controller_profile' from source: play vars 13531 1726882464.18512: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13531 1726882464.18617: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13531 1726882464.18642: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13531 1726882464.18674: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13531 1726882464.18695: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13531 1726882464.18725: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13531 1726882464.18740: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13531 1726882464.18764: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882464.18784: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13531 1726882464.18820: variable 
'__network_wireless_connections_defined' from source: role '' defaults 13531 1726882464.18974: variable 'network_connections' from source: task vars 13531 1726882464.18979: variable 'controller_profile' from source: play vars 13531 1726882464.19020: variable 'controller_profile' from source: play vars 13531 1726882464.19041: Evaluated conditional (__network_wpa_supplicant_required): False 13531 1726882464.19045: when evaluation is False, skipping this task 13531 1726882464.19047: _execute() done 13531 1726882464.19049: dumping result to json 13531 1726882464.19052: done dumping result, returning 13531 1726882464.19061: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-4fd9-519d-000000000179] 13531 1726882464.19080: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000179 13531 1726882464.19152: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000179 13531 1726882464.19155: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 13531 1726882464.19206: no more pending results, returning what we have 13531 1726882464.19209: results queue empty 13531 1726882464.19210: checking for any_errors_fatal 13531 1726882464.19230: done checking for any_errors_fatal 13531 1726882464.19231: checking for max_fail_percentage 13531 1726882464.19233: done checking for max_fail_percentage 13531 1726882464.19233: checking to see if all hosts have failed and the running result is not ok 13531 1726882464.19234: done checking to see if all hosts have failed 13531 1726882464.19235: getting the remaining hosts for this loop 13531 1726882464.19236: done getting the remaining hosts for this loop 13531 1726882464.19239: getting the next task for host managed_node2 13531 1726882464.19245: done getting next task for host managed_node2 13531 1726882464.19249: ^ task is: 
TASK: fedora.linux_system_roles.network : Enable network service 13531 1726882464.19252: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13531 1726882464.19272: getting variables 13531 1726882464.19274: in VariableManager get_vars() 13531 1726882464.19327: Calling all_inventory to load vars for managed_node2 13531 1726882464.19329: Calling groups_inventory to load vars for managed_node2 13531 1726882464.19331: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882464.19340: Calling all_plugins_play to load vars for managed_node2 13531 1726882464.19343: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882464.19345: Calling groups_plugins_play to load vars for managed_node2 13531 1726882464.20148: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882464.21098: done with get_vars() 13531 1726882464.21113: done getting variables 13531 1726882464.21155: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:34:24 -0400 (0:00:00.061) 0:00:52.107 ****** 13531 1726882464.21179: entering _queue_task() for managed_node2/service 13531 1726882464.21376: worker is 1 (out of 1 available) 13531 1726882464.21390: exiting _queue_task() for managed_node2/service 13531 1726882464.21402: done queuing things up, now waiting for results queue to drain 13531 1726882464.21404: waiting for pending results... 13531 1726882464.21582: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 13531 1726882464.21688: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000017a 13531 1726882464.21703: variable 'ansible_search_path' from source: unknown 13531 1726882464.21707: variable 'ansible_search_path' from source: unknown 13531 1726882464.21738: calling self._execute() 13531 1726882464.21815: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882464.21819: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882464.21827: variable 'omit' from source: magic vars 13531 1726882464.22097: variable 'ansible_distribution_major_version' from source: facts 13531 1726882464.22107: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882464.22188: variable 'network_provider' from source: set_fact 13531 1726882464.22192: Evaluated conditional (network_provider == "initscripts"): False 13531 1726882464.22196: when evaluation is False, skipping this task 13531 1726882464.22198: _execute() done 13531 1726882464.22201: dumping result to json 13531 1726882464.22204: done dumping result, 
returning 13531 1726882464.22211: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-4fd9-519d-00000000017a] 13531 1726882464.22217: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000017a 13531 1726882464.22308: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000017a 13531 1726882464.22311: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13531 1726882464.22361: no more pending results, returning what we have 13531 1726882464.22365: results queue empty 13531 1726882464.22366: checking for any_errors_fatal 13531 1726882464.22371: done checking for any_errors_fatal 13531 1726882464.22372: checking for max_fail_percentage 13531 1726882464.22373: done checking for max_fail_percentage 13531 1726882464.22374: checking to see if all hosts have failed and the running result is not ok 13531 1726882464.22375: done checking to see if all hosts have failed 13531 1726882464.22375: getting the remaining hosts for this loop 13531 1726882464.22377: done getting the remaining hosts for this loop 13531 1726882464.22379: getting the next task for host managed_node2 13531 1726882464.22384: done getting next task for host managed_node2 13531 1726882464.22387: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13531 1726882464.22391: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13531 1726882464.22408: getting variables 13531 1726882464.22410: in VariableManager get_vars() 13531 1726882464.22460: Calling all_inventory to load vars for managed_node2 13531 1726882464.22462: Calling groups_inventory to load vars for managed_node2 13531 1726882464.22466: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882464.22473: Calling all_plugins_play to load vars for managed_node2 13531 1726882464.22474: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882464.22476: Calling groups_plugins_play to load vars for managed_node2 13531 1726882464.24139: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882464.26003: done with get_vars() 13531 1726882464.26033: done getting variables 13531 1726882464.26100: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:34:24 -0400 (0:00:00.049) 0:00:52.156 ****** 13531 1726882464.26140: entering _queue_task() for managed_node2/copy 13531 1726882464.26469: worker is 1 (out of 1 available) 13531 
1726882464.26487: exiting _queue_task() for managed_node2/copy 13531 1726882464.26500: done queuing things up, now waiting for results queue to drain 13531 1726882464.26501: waiting for pending results... 13531 1726882464.26826: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13531 1726882464.26995: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000017b 13531 1726882464.27021: variable 'ansible_search_path' from source: unknown 13531 1726882464.27033: variable 'ansible_search_path' from source: unknown 13531 1726882464.27084: calling self._execute() 13531 1726882464.27196: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882464.27208: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882464.27223: variable 'omit' from source: magic vars 13531 1726882464.27651: variable 'ansible_distribution_major_version' from source: facts 13531 1726882464.27679: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882464.27822: variable 'network_provider' from source: set_fact 13531 1726882464.27833: Evaluated conditional (network_provider == "initscripts"): False 13531 1726882464.27841: when evaluation is False, skipping this task 13531 1726882464.27849: _execute() done 13531 1726882464.27859: dumping result to json 13531 1726882464.27870: done dumping result, returning 13531 1726882464.27882: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-4fd9-519d-00000000017b] 13531 1726882464.27905: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000017b skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 13531 1726882464.28076: no more pending results, returning what we have 13531 
1726882464.28080: results queue empty 13531 1726882464.28081: checking for any_errors_fatal 13531 1726882464.28088: done checking for any_errors_fatal 13531 1726882464.28089: checking for max_fail_percentage 13531 1726882464.28091: done checking for max_fail_percentage 13531 1726882464.28092: checking to see if all hosts have failed and the running result is not ok 13531 1726882464.28093: done checking to see if all hosts have failed 13531 1726882464.28094: getting the remaining hosts for this loop 13531 1726882464.28095: done getting the remaining hosts for this loop 13531 1726882464.28099: getting the next task for host managed_node2 13531 1726882464.28106: done getting next task for host managed_node2 13531 1726882464.28111: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13531 1726882464.28115: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13531 1726882464.28140: getting variables 13531 1726882464.28142: in VariableManager get_vars() 13531 1726882464.28207: Calling all_inventory to load vars for managed_node2 13531 1726882464.28210: Calling groups_inventory to load vars for managed_node2 13531 1726882464.28212: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882464.28226: Calling all_plugins_play to load vars for managed_node2 13531 1726882464.28229: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882464.28232: Calling groups_plugins_play to load vars for managed_node2 13531 1726882464.29231: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000017b 13531 1726882464.29234: WORKER PROCESS EXITING 13531 1726882464.30087: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882464.31993: done with get_vars() 13531 1726882464.32016: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:34:24 -0400 (0:00:00.059) 0:00:52.216 ****** 13531 1726882464.32113: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 13531 1726882464.32398: worker is 1 (out of 1 available) 13531 1726882464.32411: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 13531 1726882464.32430: done queuing things up, now waiting for results queue to drain 13531 1726882464.32432: waiting for pending results... 
13531 1726882464.32735: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13531 1726882464.32899: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000017c 13531 1726882464.32921: variable 'ansible_search_path' from source: unknown 13531 1726882464.32928: variable 'ansible_search_path' from source: unknown 13531 1726882464.32977: calling self._execute() 13531 1726882464.33089: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882464.33103: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882464.33118: variable 'omit' from source: magic vars 13531 1726882464.33540: variable 'ansible_distribution_major_version' from source: facts 13531 1726882464.33561: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882464.33576: variable 'omit' from source: magic vars 13531 1726882464.33665: variable 'omit' from source: magic vars 13531 1726882464.33835: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13531 1726882464.36637: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13531 1726882464.36719: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13531 1726882464.36766: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13531 1726882464.36816: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13531 1726882464.36848: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13531 1726882464.36938: variable 'network_provider' from source: set_fact 13531 1726882464.37085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13531 1726882464.37128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13531 1726882464.37165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13531 1726882464.37212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13531 1726882464.37243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13531 1726882464.37323: variable 'omit' from source: magic vars 13531 1726882464.37461: variable 'omit' from source: magic vars 13531 1726882464.37579: variable 'network_connections' from source: task vars 13531 1726882464.37594: variable 'controller_profile' from source: play vars 13531 1726882464.37668: variable 'controller_profile' from source: play vars 13531 1726882464.37827: variable 'omit' from source: magic vars 13531 1726882464.37841: variable '__lsr_ansible_managed' from source: task vars 13531 1726882464.37921: variable '__lsr_ansible_managed' from source: task vars 13531 1726882464.38137: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 13531 1726882464.38379: Loaded config def from plugin (lookup/template) 13531 1726882464.38389: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 13531 1726882464.38421: File lookup term: get_ansible_managed.j2 13531 1726882464.38437: 
variable 'ansible_search_path' from source: unknown 13531 1726882464.38446: evaluation_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 13531 1726882464.38467: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 13531 1726882464.38490: variable 'ansible_search_path' from source: unknown 13531 1726882464.45524: variable 'ansible_managed' from source: unknown 13531 1726882464.45681: variable 'omit' from source: magic vars 13531 1726882464.45723: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882464.45753: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882464.45781: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882464.45813: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882464.45829: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882464.45866: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882464.45876: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882464.45885: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882464.45998: Set connection var ansible_pipelining to False 13531 1726882464.46009: Set connection var ansible_timeout to 10 13531 1726882464.46029: Set connection var ansible_shell_executable to /bin/sh 13531 1726882464.46040: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882464.46046: Set connection var ansible_connection to ssh 13531 1726882464.46052: Set connection var ansible_shell_type to sh 13531 1726882464.46087: variable 'ansible_shell_executable' from source: unknown 13531 1726882464.46096: variable 'ansible_connection' from source: unknown 13531 1726882464.46103: variable 'ansible_module_compression' from source: unknown 13531 1726882464.46110: variable 'ansible_shell_type' from source: unknown 13531 1726882464.46117: variable 'ansible_shell_executable' from source: unknown 13531 1726882464.46134: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882464.46145: variable 'ansible_pipelining' from source: unknown 13531 1726882464.46153: variable 'ansible_timeout' from source: unknown 13531 1726882464.46167: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882464.46312: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13531 1726882464.46338: variable 'omit' from source: magic vars 13531 1726882464.46362: starting attempt loop 13531 1726882464.46372: running the handler 13531 1726882464.46390: _low_level_execute_command(): starting 13531 1726882464.46401: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13531 1726882464.47200: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882464.47216: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882464.47240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882464.47262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882464.47307: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882464.47320: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882464.47342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882464.47366: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882464.47380: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882464.47392: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882464.47404: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882464.47419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882464.47436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882464.47460: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.11.158 originally 10.31.11.158 <<< 13531 1726882464.47481: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882464.47496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882464.47585: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882464.47602: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882464.47616: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882464.47756: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882464.49420: stdout chunk (state=3): >>>/root <<< 13531 1726882464.49519: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882464.49586: stderr chunk (state=3): >>><<< 13531 1726882464.49589: stdout chunk (state=3): >>><<< 13531 1726882464.49612: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882464.49624: _low_level_execute_command(): starting 13531 1726882464.49629: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882464.4961154-15842-181147453128498 `" && echo ansible-tmp-1726882464.4961154-15842-181147453128498="` echo /root/.ansible/tmp/ansible-tmp-1726882464.4961154-15842-181147453128498 `" ) && sleep 0' 13531 1726882464.50234: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882464.50242: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882464.50252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882464.50267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882464.50304: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882464.50310: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882464.50319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882464.50332: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882464.50339: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882464.50345: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882464.50353: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882464.50362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882464.50379: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882464.50386: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882464.50393: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882464.50403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882464.50487: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882464.50501: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882464.50515: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882464.50655: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882464.52525: stdout chunk (state=3): >>>ansible-tmp-1726882464.4961154-15842-181147453128498=/root/.ansible/tmp/ansible-tmp-1726882464.4961154-15842-181147453128498 <<< 13531 1726882464.52711: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882464.52884: stderr chunk (state=3): >>><<< 13531 1726882464.52888: stdout chunk (state=3): >>><<< 13531 1726882464.53072: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882464.4961154-15842-181147453128498=/root/.ansible/tmp/ansible-tmp-1726882464.4961154-15842-181147453128498 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882464.53081: variable 'ansible_module_compression' from source: unknown 13531 1726882464.53084: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13531i_snf1g8/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 13531 1726882464.53086: variable 'ansible_facts' from source: unknown 13531 1726882464.53134: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882464.4961154-15842-181147453128498/AnsiballZ_network_connections.py 13531 1726882464.53280: Sending initial data 13531 1726882464.53284: Sent initial data (168 bytes) 13531 1726882464.54176: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882464.54190: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882464.54206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882464.54224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882464.54267: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882464.54280: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882464.54295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882464.54313: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass <<< 13531 1726882464.54324: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882464.54335: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882464.54346: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882464.54359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882464.54378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882464.54390: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882464.54401: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882464.54414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882464.54493: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882464.54514: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882464.54531: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882464.54659: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882464.56424: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13531 
1726882464.56520: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13531 1726882464.56621: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13531i_snf1g8/tmp0b37wsl4 /root/.ansible/tmp/ansible-tmp-1726882464.4961154-15842-181147453128498/AnsiballZ_network_connections.py <<< 13531 1726882464.56719: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13531 1726882464.58802: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882464.58869: stderr chunk (state=3): >>><<< 13531 1726882464.58872: stdout chunk (state=3): >>><<< 13531 1726882464.58874: done transferring module to remote 13531 1726882464.58877: _low_level_execute_command(): starting 13531 1726882464.58880: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882464.4961154-15842-181147453128498/ /root/.ansible/tmp/ansible-tmp-1726882464.4961154-15842-181147453128498/AnsiballZ_network_connections.py && sleep 0' 13531 1726882464.59510: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882464.59525: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882464.59539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882464.59558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882464.59607: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882464.59622: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882464.59638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882464.59656: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass <<< 13531 1726882464.59672: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882464.59684: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882464.59696: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882464.59714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882464.59734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882464.59748: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882464.59760: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882464.59776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882464.59857: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882464.59877: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882464.59892: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882464.60019: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882464.61848: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882464.61852: stdout chunk (state=3): >>><<< 13531 1726882464.61860: stderr chunk (state=3): >>><<< 13531 1726882464.61879: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882464.61887: _low_level_execute_command(): starting 13531 1726882464.61890: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882464.4961154-15842-181147453128498/AnsiballZ_network_connections.py && sleep 0' 13531 1726882464.65541: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882464.65556: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882464.65573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882464.65590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882464.65750: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882464.65762: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882464.65779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882464.65795: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882464.65806: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882464.65816: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882464.65827: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882464.65846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882464.65862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882464.65876: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882464.65886: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882464.65899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882464.65986: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882464.66079: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882464.66093: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882464.66298: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882465.00410: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload__7kcsm6n/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back <<< 13531 1726882465.00415: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload__7kcsm6n/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail 
ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/467176f0-8c25-4dd1-9498-f31f30164a10: error=unknown <<< 13531 1726882465.00595: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "down", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "down", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 13531 1726882465.02181: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 13531 1726882465.02251: stderr chunk (state=3): >>><<< 13531 1726882465.02255: stdout chunk (state=3): >>><<< 13531 1726882465.02371: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload__7kcsm6n/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload__7kcsm6n/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/467176f0-8c25-4dd1-9498-f31f30164a10: error=unknown {"changed": true, "warnings": [], 
"stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "down", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "down", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
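The module execution above can be expressed as a task of roughly this shape. This is a reconstruction from the logged `module_args` only (provider `nm`, one connection entry for `bond0`); the actual task lives inside the fedora.linux_system_roles.network role and is not shown in this log:

```yaml
# Hypothetical reconstruction from the logged module_args, not the
# role's real task file. Values match the invocation recorded above.
- name: Bring down and remove the bond0 connection profile
  fedora.linux_system_roles.network_connections:
    provider: nm
    connections:
      - name: bond0
        state: down
        persistent_state: absent
```

Note that the run still reports `"changed": true` even though the remote stdout carries an `LsrNetworkNmError` traceback for the volatilize step; the module treats that path as non-fatal here.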
13531 1726882465.02374: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'down', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882464.4961154-15842-181147453128498/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13531 1726882465.02377: _low_level_execute_command(): starting 13531 1726882465.02379: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882464.4961154-15842-181147453128498/ > /dev/null 2>&1 && sleep 0' 13531 1726882465.02924: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882465.02928: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882465.02932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882465.02934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882465.02990: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882465.02993: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882465.02996: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882465.03015: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882465.03018: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882465.03020: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882465.03022: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882465.03088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882465.03091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882465.03093: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882465.03095: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882465.03097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882465.03167: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882465.03171: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882465.03173: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882465.03290: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882465.05133: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882465.05181: stderr chunk (state=3): >>><<< 13531 1726882465.05183: stdout chunk (state=3): >>><<< 13531 1726882465.05204: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882465.05227: handler run complete 13531 1726882465.05245: attempt loop complete, returning result 13531 1726882465.05248: _execute() done 13531 1726882465.05251: dumping result to json 13531 1726882465.05260: done dumping result, returning 13531 1726882465.05272: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-4fd9-519d-00000000017c] 13531 1726882465.05277: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000017c 13531 1726882465.05405: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000017c 13531 1726882465.05409: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 13531 1726882465.05509: no more pending results, returning what we have 13531 1726882465.05512: results 
queue empty 13531 1726882465.05514: checking for any_errors_fatal 13531 1726882465.05522: done checking for any_errors_fatal 13531 1726882465.05522: checking for max_fail_percentage 13531 1726882465.05524: done checking for max_fail_percentage 13531 1726882465.05524: checking to see if all hosts have failed and the running result is not ok 13531 1726882465.05525: done checking to see if all hosts have failed 13531 1726882465.05526: getting the remaining hosts for this loop 13531 1726882465.05527: done getting the remaining hosts for this loop 13531 1726882465.05530: getting the next task for host managed_node2 13531 1726882465.05536: done getting next task for host managed_node2 13531 1726882465.05540: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 13531 1726882465.05544: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13531 1726882465.05555: getting variables 13531 1726882465.05557: in VariableManager get_vars() 13531 1726882465.05607: Calling all_inventory to load vars for managed_node2 13531 1726882465.05609: Calling groups_inventory to load vars for managed_node2 13531 1726882465.05612: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882465.05622: Calling all_plugins_play to load vars for managed_node2 13531 1726882465.05624: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882465.05626: Calling groups_plugins_play to load vars for managed_node2 13531 1726882465.07771: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882465.09807: done with get_vars() 13531 1726882465.09842: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:34:25 -0400 (0:00:00.778) 0:00:52.994 ****** 13531 1726882465.09932: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 13531 1726882465.10286: worker is 1 (out of 1 available) 13531 1726882465.10299: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 13531 1726882465.10313: done queuing things up, now waiting for results queue to drain 13531 1726882465.10314: waiting for pending results... 
13531 1726882465.10654: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state
13531 1726882465.10848: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000017d
13531 1726882465.10872: variable 'ansible_search_path' from source: unknown
13531 1726882465.10884: variable 'ansible_search_path' from source: unknown
13531 1726882465.10941: calling self._execute()
13531 1726882465.11069: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882465.11083: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882465.11101: variable 'omit' from source: magic vars
13531 1726882465.11533: variable 'ansible_distribution_major_version' from source: facts
13531 1726882465.11551: Evaluated conditional (ansible_distribution_major_version != '6'): True
13531 1726882465.11682: variable 'network_state' from source: role '' defaults
13531 1726882465.11705: Evaluated conditional (network_state != {}): False
13531 1726882465.11712: when evaluation is False, skipping this task
13531 1726882465.11726: _execute() done
13531 1726882465.11734: dumping result to json
13531 1726882465.11749: done dumping result, returning
13531 1726882465.11766: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-4fd9-519d-00000000017d]
13531 1726882465.11802: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000017d
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
13531 1726882465.12092: no more pending results, returning what we have
13531 1726882465.12097: results queue empty
13531 1726882465.12098: checking for any_errors_fatal
13531 1726882465.12111: done checking for any_errors_fatal
13531 1726882465.12112: checking for max_fail_percentage
13531 1726882465.12114: done checking for max_fail_percentage
13531 1726882465.12115: checking to see if all hosts have failed and the running result is not ok
13531 1726882465.12116: done checking to see if all hosts have failed
13531 1726882465.12116: getting the remaining hosts for this loop
13531 1726882465.12118: done getting the remaining hosts for this loop
13531 1726882465.12121: getting the next task for host managed_node2
13531 1726882465.12129: done getting next task for host managed_node2
13531 1726882465.12135: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections
13531 1726882465.12139: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
13531 1726882465.12163: getting variables
13531 1726882465.12167: in VariableManager get_vars()
13531 1726882465.12228: Calling all_inventory to load vars for managed_node2
13531 1726882465.12232: Calling groups_inventory to load vars for managed_node2
13531 1726882465.12235: Calling all_plugins_inventory to load vars for managed_node2
13531 1726882465.12249: Calling all_plugins_play to load vars for managed_node2
13531 1726882465.12252: Calling groups_plugins_inventory to load vars for managed_node2
13531 1726882465.12255: Calling groups_plugins_play to load vars for managed_node2
13531 1726882465.13258: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000017d
13531 1726882465.13262: WORKER PROCESS EXITING
13531 1726882465.14225: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13531 1726882465.16199: done with get_vars()
13531 1726882465.16223: done getting variables
13531 1726882465.16290: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177
Friday 20 September 2024 21:34:25 -0400 (0:00:00.063) 0:00:53.058 ******
13531 1726882465.16324: entering _queue_task() for managed_node2/debug
13531 1726882465.16656: worker is 1 (out of 1 available)
13531 1726882465.16671: exiting _queue_task() for managed_node2/debug
13531 1726882465.16686: done queuing things up, now waiting for results queue to drain
13531 1726882465.16687: waiting for pending results...
13531 1726882465.16995: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections
13531 1726882465.17165: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000017e
13531 1726882465.17186: variable 'ansible_search_path' from source: unknown
13531 1726882465.17195: variable 'ansible_search_path' from source: unknown
13531 1726882465.17240: calling self._execute()
13531 1726882465.17345: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882465.17362: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882465.17386: variable 'omit' from source: magic vars
13531 1726882465.17788: variable 'ansible_distribution_major_version' from source: facts
13531 1726882465.17813: Evaluated conditional (ansible_distribution_major_version != '6'): True
13531 1726882465.17826: variable 'omit' from source: magic vars
13531 1726882465.17909: variable 'omit' from source: magic vars
13531 1726882465.17955: variable 'omit' from source: magic vars
13531 1726882465.18004: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
13531 1726882465.18052: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
13531 1726882465.18079: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
13531 1726882465.18102: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13531 1726882465.18122: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13531 1726882465.18165: variable 'inventory_hostname' from source: host vars for 'managed_node2'
13531 1726882465.18174: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882465.18183: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882465.18315: Set connection var ansible_pipelining to False
13531 1726882465.18327: Set connection var ansible_timeout to 10
13531 1726882465.18345: Set connection var ansible_shell_executable to /bin/sh
13531 1726882465.18361: Set connection var ansible_module_compression to ZIP_DEFLATED
13531 1726882465.18372: Set connection var ansible_connection to ssh
13531 1726882465.18379: Set connection var ansible_shell_type to sh
13531 1726882465.18410: variable 'ansible_shell_executable' from source: unknown
13531 1726882465.18420: variable 'ansible_connection' from source: unknown
13531 1726882465.18428: variable 'ansible_module_compression' from source: unknown
13531 1726882465.18435: variable 'ansible_shell_type' from source: unknown
13531 1726882465.18446: variable 'ansible_shell_executable' from source: unknown
13531 1726882465.18455: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882465.18471: variable 'ansible_pipelining' from source: unknown
13531 1726882465.18479: variable 'ansible_timeout' from source: unknown
13531 1726882465.18488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882465.18635: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
13531 1726882465.18653: variable 'omit' from source: magic vars
13531 1726882465.18671: starting attempt loop
13531 1726882465.18685: running the handler
13531 1726882465.18824: variable '__network_connections_result' from source: set_fact
13531 1726882465.18883: handler run complete
13531 1726882465.18915: attempt loop complete, returning result
13531 1726882465.18922: _execute() done
13531 1726882465.18928: dumping result to json
13531 1726882465.18934: done dumping result, returning
13531 1726882465.18945: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-4fd9-519d-00000000017e]
13531 1726882465.18956: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000017e
13531 1726882465.19077: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000017e
13531 1726882465.19084: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "__network_connections_result.stderr_lines": [
        ""
    ]
}
13531 1726882465.19160: no more pending results, returning what we have
13531 1726882465.19165: results queue empty
13531 1726882465.19166: checking for any_errors_fatal
13531 1726882465.19173: done checking for any_errors_fatal
13531 1726882465.19174: checking for max_fail_percentage
13531 1726882465.19175: done checking for max_fail_percentage
13531 1726882465.19176: checking to see if all hosts have failed and the running result is not ok
13531 1726882465.19177: done checking to see if all hosts have failed
13531 1726882465.19178: getting the remaining hosts for this loop
13531 1726882465.19179: done getting the remaining hosts for this loop
13531 1726882465.19183: getting the next task for host managed_node2
13531 1726882465.19190: done getting next task for host managed_node2
13531 1726882465.19194: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
13531 1726882465.19199: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
13531 1726882465.19211: getting variables
13531 1726882465.19214: in VariableManager get_vars()
13531 1726882465.19271: Calling all_inventory to load vars for managed_node2
13531 1726882465.19274: Calling groups_inventory to load vars for managed_node2
13531 1726882465.19277: Calling all_plugins_inventory to load vars for managed_node2
13531 1726882465.19287: Calling all_plugins_play to load vars for managed_node2
13531 1726882465.19290: Calling groups_plugins_inventory to load vars for managed_node2
13531 1726882465.19293: Calling groups_plugins_play to load vars for managed_node2
13531 1726882465.21542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13531 1726882465.23375: done with get_vars()
13531 1726882465.23410: done getting variables
13531 1726882465.23471: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181
Friday 20 September 2024 21:34:25 -0400 (0:00:00.071) 0:00:53.130 ******
13531 1726882465.23513: entering _queue_task() for managed_node2/debug
13531 1726882465.23825: worker is 1 (out of 1 available)
13531 1726882465.23840: exiting _queue_task() for managed_node2/debug
13531 1726882465.23853: done queuing things up, now waiting for results queue to drain
13531 1726882465.23854: waiting for pending results...
13531 1726882465.24157: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
13531 1726882465.24317: in run() - task 0e448fcc-3ce9-4fd9-519d-00000000017f
13531 1726882465.24336: variable 'ansible_search_path' from source: unknown
13531 1726882465.24343: variable 'ansible_search_path' from source: unknown
13531 1726882465.24393: calling self._execute()
13531 1726882465.24500: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882465.24517: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882465.24532: variable 'omit' from source: magic vars
13531 1726882465.24943: variable 'ansible_distribution_major_version' from source: facts
13531 1726882465.24965: Evaluated conditional (ansible_distribution_major_version != '6'): True
13531 1726882465.24977: variable 'omit' from source: magic vars
13531 1726882465.25056: variable 'omit' from source: magic vars
13531 1726882465.25099: variable 'omit' from source: magic vars
13531 1726882465.25155: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
13531 1726882465.25199: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
13531 1726882465.25229: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
13531 1726882465.25252: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13531 1726882465.25270: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13531 1726882465.25306: variable 'inventory_hostname' from source: host vars for 'managed_node2'
13531 1726882465.25314: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882465.25320: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882465.25502: Set connection var ansible_pipelining to False
13531 1726882465.25516: Set connection var ansible_timeout to 10
13531 1726882465.25530: Set connection var ansible_shell_executable to /bin/sh
13531 1726882465.25540: Set connection var ansible_module_compression to ZIP_DEFLATED
13531 1726882465.25551: Set connection var ansible_connection to ssh
13531 1726882465.25563: Set connection var ansible_shell_type to sh
13531 1726882465.25595: variable 'ansible_shell_executable' from source: unknown
13531 1726882465.25602: variable 'ansible_connection' from source: unknown
13531 1726882465.25608: variable 'ansible_module_compression' from source: unknown
13531 1726882465.25614: variable 'ansible_shell_type' from source: unknown
13531 1726882465.25619: variable 'ansible_shell_executable' from source: unknown
13531 1726882465.25626: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882465.25635: variable 'ansible_pipelining' from source: unknown
13531 1726882465.25641: variable 'ansible_timeout' from source: unknown
13531 1726882465.25648: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882465.25806: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
13531 1726882465.25824: variable 'omit' from source: magic vars
13531 1726882465.25835: starting attempt loop
13531 1726882465.25841: running the handler
13531 1726882465.25933: variable '__network_connections_result' from source: set_fact
13531 1726882465.26045: variable '__network_connections_result' from source: set_fact
13531 1726882465.26223: handler run complete
13531 1726882465.26254: attempt loop complete, returning result
13531 1726882465.26262: _execute() done
13531 1726882465.26272: dumping result to json
13531 1726882465.26301: done dumping result, returning
13531 1726882465.26319: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-4fd9-519d-00000000017f]
13531 1726882465.26335: sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000017f
ok: [managed_node2] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "name": "bond0",
                        "persistent_state": "absent",
                        "state": "down"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "\n",
        "stderr_lines": [
            ""
        ]
    }
}
13531 1726882465.26574: no more pending results, returning what we have
13531 1726882465.26597: results queue empty
13531 1726882465.26599: checking for any_errors_fatal
13531 1726882465.26606: done checking for any_errors_fatal
13531 1726882465.26607: checking for max_fail_percentage
13531 1726882465.26609: done checking for max_fail_percentage
13531 1726882465.26610: checking to see if all hosts have failed and the running result is not ok
13531 1726882465.26611: done checking to see if all hosts have failed
13531 1726882465.26611: getting the remaining hosts for this loop
13531 1726882465.26613: done getting the remaining hosts for this loop
13531 1726882465.26616: getting the next task for host managed_node2
13531 1726882465.26624: done getting next task for host managed_node2
13531 1726882465.26627: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
13531 1726882465.26631: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
13531 1726882465.26645: getting variables
13531 1726882465.26647: in VariableManager get_vars()
13531 1726882465.26727: Calling all_inventory to load vars for managed_node2
13531 1726882465.26730: Calling groups_inventory to load vars for managed_node2
13531 1726882465.26733: Calling all_plugins_inventory to load vars for managed_node2
13531 1726882465.26743: Calling all_plugins_play to load vars for managed_node2
13531 1726882465.26746: Calling groups_plugins_inventory to load vars for managed_node2
13531 1726882465.26749: Calling groups_plugins_play to load vars for managed_node2
13531 1726882465.27771: done sending task result for task 0e448fcc-3ce9-4fd9-519d-00000000017f
13531 1726882465.27774: WORKER PROCESS EXITING
13531 1726882465.29305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13531 1726882465.31158: done with get_vars()
13531 1726882465.31197: done getting variables
13531 1726882465.31254: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186
Friday 20 September 2024 21:34:25 -0400 (0:00:00.077) 0:00:53.208 ******
13531 1726882465.31298: entering _queue_task() for managed_node2/debug
13531 1726882465.31596: worker is 1 (out of 1 available)
13531 1726882465.31612: exiting _queue_task() for managed_node2/debug
13531 1726882465.31630: done queuing things up, now waiting for results queue to drain
13531 1726882465.31632: waiting for pending results...
13531 1726882465.32596: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
13531 1726882465.32792: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000180
13531 1726882465.32812: variable 'ansible_search_path' from source: unknown
13531 1726882465.32819: variable 'ansible_search_path' from source: unknown
13531 1726882465.32874: calling self._execute()
13531 1726882465.32983: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882465.32995: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882465.33008: variable 'omit' from source: magic vars
13531 1726882465.33436: variable 'ansible_distribution_major_version' from source: facts
13531 1726882465.33454: Evaluated conditional (ansible_distribution_major_version != '6'): True
13531 1726882465.33594: variable 'network_state' from source: role '' defaults
13531 1726882465.33632: Evaluated conditional (network_state != {}): False
13531 1726882465.33641: when evaluation is False, skipping this task
13531 1726882465.33649: _execute() done
13531 1726882465.33656: dumping result to json
13531 1726882465.33668: done dumping result, returning
13531 1726882465.33681: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-4fd9-519d-000000000180]
13531 1726882465.33693: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000180
skipping: [managed_node2] => {
    "false_condition": "network_state != {}"
}
13531 1726882465.33862: no more pending results, returning what we have
13531 1726882465.33868: results queue empty
13531 1726882465.33869: checking for any_errors_fatal
13531 1726882465.33879: done checking for any_errors_fatal
13531 1726882465.33880: checking for max_fail_percentage
13531 1726882465.33882: done checking for max_fail_percentage
13531 1726882465.33883: checking to see if all hosts have failed and the running result is not ok
13531 1726882465.33884: done checking to see if all hosts have failed
13531 1726882465.33885: getting the remaining hosts for this loop
13531 1726882465.33886: done getting the remaining hosts for this loop
13531 1726882465.33890: getting the next task for host managed_node2
13531 1726882465.33897: done getting next task for host managed_node2
13531 1726882465.33901: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity
13531 1726882465.33906: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
13531 1726882465.33928: getting variables
13531 1726882465.33930: in VariableManager get_vars()
13531 1726882465.33988: Calling all_inventory to load vars for managed_node2
13531 1726882465.33991: Calling groups_inventory to load vars for managed_node2
13531 1726882465.33993: Calling all_plugins_inventory to load vars for managed_node2
13531 1726882465.34006: Calling all_plugins_play to load vars for managed_node2
13531 1726882465.34009: Calling groups_plugins_inventory to load vars for managed_node2
13531 1726882465.34012: Calling groups_plugins_play to load vars for managed_node2
13531 1726882465.35018: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000180
13531 1726882465.35022: WORKER PROCESS EXITING
13531 1726882465.35951: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13531 1726882465.39190: done with get_vars()
13531 1726882465.39222: done getting variables

TASK [fedora.linux_system_roles.network : Re-test connectivity] ****************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Friday 20 September 2024 21:34:25 -0400 (0:00:00.081) 0:00:53.289 ******
13531 1726882465.39433: entering _queue_task() for managed_node2/ping
13531 1726882465.39811: worker is 1 (out of 1 available)
13531 1726882465.39823: exiting _queue_task() for managed_node2/ping
13531 1726882465.39836: done queuing things up, now waiting for results queue to drain
13531 1726882465.39837: waiting for pending results...
13531 1726882465.41159: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity
13531 1726882465.41305: in run() - task 0e448fcc-3ce9-4fd9-519d-000000000181
13531 1726882465.41318: variable 'ansible_search_path' from source: unknown
13531 1726882465.41322: variable 'ansible_search_path' from source: unknown
13531 1726882465.41362: calling self._execute()
13531 1726882465.42172: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882465.42180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882465.42289: variable 'omit' from source: magic vars
13531 1726882465.42568: variable 'ansible_distribution_major_version' from source: facts
13531 1726882465.42580: Evaluated conditional (ansible_distribution_major_version != '6'): True
13531 1726882465.42587: variable 'omit' from source: magic vars
13531 1726882465.42648: variable 'omit' from source: magic vars
13531 1726882465.43392: variable 'omit' from source: magic vars
13531 1726882465.43434: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
13531 1726882465.43476: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
13531 1726882465.43496: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
13531 1726882465.43515: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13531 1726882465.43526: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13531 1726882465.43557: variable 'inventory_hostname' from source: host vars for 'managed_node2'
13531 1726882465.43564: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882465.43567: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882465.43671: Set connection var ansible_pipelining to False
13531 1726882465.43676: Set connection var ansible_timeout to 10
13531 1726882465.43682: Set connection var ansible_shell_executable to /bin/sh
13531 1726882465.43687: Set connection var ansible_module_compression to ZIP_DEFLATED
13531 1726882465.43690: Set connection var ansible_connection to ssh
13531 1726882465.43692: Set connection var ansible_shell_type to sh
13531 1726882465.43719: variable 'ansible_shell_executable' from source: unknown
13531 1726882465.43722: variable 'ansible_connection' from source: unknown
13531 1726882465.43725: variable 'ansible_module_compression' from source: unknown
13531 1726882465.43727: variable 'ansible_shell_type' from source: unknown
13531 1726882465.43730: variable 'ansible_shell_executable' from source: unknown
13531 1726882465.43732: variable 'ansible_host' from source: host vars for 'managed_node2'
13531 1726882465.43736: variable 'ansible_pipelining' from source: unknown
13531 1726882465.43738: variable 'ansible_timeout' from source: unknown
13531 1726882465.43742: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
13531 1726882465.44452: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__)
13531 1726882465.44468: variable 'omit' from source: magic vars
13531 1726882465.44475: starting attempt loop
13531 1726882465.44478: running the handler
13531 1726882465.44492: _low_level_execute_command(): starting
13531 1726882465.44499: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
13531 1726882465.45772: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
13531 1726882465.46482: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
13531 1726882465.46493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
13531 1726882465.46508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
13531 1726882465.46550: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<<
13531 1726882465.46556: stderr chunk (state=3): >>>debug2: match not found <<<
13531 1726882465.46571: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13531 1726882465.46585: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
13531 1726882465.46594: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<<
13531 1726882465.46601: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
13531 1726882465.46609: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
13531 1726882465.46618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
13531 1726882465.46630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
13531 1726882465.46637: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<<
13531 1726882465.46645: stderr chunk (state=3): >>>debug2: match found <<<
13531 1726882465.46657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13531 1726882465.46733: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
13531 1726882465.46755: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
13531 1726882465.46771: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
13531 1726882465.46905: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13531 1726882465.48574: stdout chunk (state=3): >>>/root <<<
13531 1726882465.48741: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13531 1726882465.48746: stdout chunk (state=3): >>><<<
13531 1726882465.48757: stderr chunk (state=3): >>><<<
13531 1726882465.48778: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
13531 1726882465.48791: _low_level_execute_command(): starting
13531 1726882465.48799: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882465.48778-15894-65541506547390 `" && echo ansible-tmp-1726882465.48778-15894-65541506547390="` echo /root/.ansible/tmp/ansible-tmp-1726882465.48778-15894-65541506547390 `" ) && sleep 0'
13531 1726882465.50316: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
13531 1726882465.50381: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
13531 1726882465.50392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
13531 1726882465.50406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
13531 1726882465.50444: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<<
13531 1726882465.50579: stderr chunk (state=3): >>>debug2: match not found <<<
13531 1726882465.50589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13531 1726882465.50602: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
13531 1726882465.50610: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<<
13531 1726882465.50617: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
13531 1726882465.50625: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
13531 1726882465.50634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
13531 1726882465.50645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
13531 1726882465.50659: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<<
13531 1726882465.50662: stderr chunk (state=3): >>>debug2: match found <<<
13531 1726882465.50669: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13531 1726882465.50738: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
13531 1726882465.50760: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
13531 1726882465.50770: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
13531 1726882465.50897: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13531 1726882465.52780: stdout chunk (state=3): >>>ansible-tmp-1726882465.48778-15894-65541506547390=/root/.ansible/tmp/ansible-tmp-1726882465.48778-15894-65541506547390 <<<
13531 1726882465.52957: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13531 1726882465.52965: stdout chunk (state=3): >>><<<
13531 1726882465.52973: stderr chunk (state=3): >>><<<
13531 1726882465.52999: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882465.48778-15894-65541506547390=/root/.ansible/tmp/ansible-tmp-1726882465.48778-15894-65541506547390 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
13531 1726882465.53045: variable 'ansible_module_compression' from source: unknown
13531 1726882465.53089: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13531i_snf1g8/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED
13531 1726882465.53124: variable 'ansible_facts' from source: unknown
13531 1726882465.53192: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882465.48778-15894-65541506547390/AnsiballZ_ping.py
13531 1726882465.53407: Sending initial data
13531 1726882465.53410: Sent initial data (150 bytes)
13531 1726882465.54410: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
13531 1726882465.54418: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
13531 1726882465.54428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
13531 1726882465.54442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
13531 1726882465.54490: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<<
13531 1726882465.54496: stderr chunk (state=3): >>>debug2: match not found <<<
13531 1726882465.54506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13531 1726882465.54518: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
13531 1726882465.54525: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<<
13531 1726882465.54532: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
13531 1726882465.54539: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
13531 1726882465.54548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
13531 1726882465.54561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
13531 1726882465.54569: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<<
13531 
1726882465.54577: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882465.54592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882465.54669: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882465.54686: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882465.54704: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882465.54830: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882465.56577: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13531 1726882465.56939: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13531 1726882465.56943: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13531i_snf1g8/tmpad8c7wmc /root/.ansible/tmp/ansible-tmp-1726882465.48778-15894-65541506547390/AnsiballZ_ping.py <<< 13531 1726882465.56945: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13531 1726882465.58294: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882465.58374: stderr chunk (state=3): >>><<< 13531 1726882465.58378: stdout chunk (state=3): >>><<< 13531 1726882465.58398: done transferring module to 
remote 13531 1726882465.58409: _low_level_execute_command(): starting 13531 1726882465.58415: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882465.48778-15894-65541506547390/ /root/.ansible/tmp/ansible-tmp-1726882465.48778-15894-65541506547390/AnsiballZ_ping.py && sleep 0' 13531 1726882465.59043: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882465.59047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882465.59089: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882465.59094: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 13531 1726882465.59107: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882465.59112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 13531 1726882465.59126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882465.59200: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882465.59206: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882465.59215: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882465.59339: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 13531 1726882465.61175: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882465.61217: stderr chunk (state=3): >>><<< 13531 1726882465.61220: stdout chunk (state=3): >>><<< 13531 1726882465.61237: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882465.61241: _low_level_execute_command(): starting 13531 1726882465.61243: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882465.48778-15894-65541506547390/AnsiballZ_ping.py && sleep 0' 13531 1726882465.61689: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882465.61692: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882465.61725: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882465.61731: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882465.61734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882465.61804: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882465.61807: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882465.61923: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882465.74812: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 13531 1726882465.75859: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 13531 1726882465.75865: stdout chunk (state=3): >>><<< 13531 1726882465.75880: stderr chunk (state=3): >>><<< 13531 1726882465.75891: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
13531 1726882465.75917: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882465.48778-15894-65541506547390/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13531 1726882465.75926: _low_level_execute_command(): starting 13531 1726882465.75933: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882465.48778-15894-65541506547390/ > /dev/null 2>&1 && sleep 0' 13531 1726882465.76635: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882465.76639: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882465.76657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882465.76676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882465.76708: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882465.76715: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882465.76724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882465.76742: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882465.76750: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 
1726882465.76772: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882465.76778: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882465.76786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882465.76805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882465.76808: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882465.76815: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882465.76835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882465.76894: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882465.76911: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882465.76921: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882465.77072: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882465.78879: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882465.78917: stderr chunk (state=3): >>><<< 13531 1726882465.78920: stdout chunk (state=3): >>><<< 13531 1726882465.78999: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882465.79002: handler run complete 13531 1726882465.79004: attempt loop complete, returning result 13531 1726882465.79006: _execute() done 13531 1726882465.79007: dumping result to json 13531 1726882465.79009: done dumping result, returning 13531 1726882465.79011: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-4fd9-519d-000000000181] 13531 1726882465.79013: sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000181 13531 1726882465.79096: done sending task result for task 0e448fcc-3ce9-4fd9-519d-000000000181 13531 1726882465.79099: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 13531 1726882465.79166: no more pending results, returning what we have 13531 1726882465.79170: results queue empty 13531 1726882465.79171: checking for any_errors_fatal 13531 1726882465.79177: done checking for any_errors_fatal 13531 1726882465.79178: checking for max_fail_percentage 13531 1726882465.79180: done checking for max_fail_percentage 13531 1726882465.79181: checking to see if all hosts have failed and the running result is not ok 13531 1726882465.79182: done checking to see if all hosts have failed 13531 1726882465.79182: getting the remaining hosts for this loop 13531 1726882465.79184: done getting the remaining hosts for this loop 13531 
1726882465.79187: getting the next task for host managed_node2 13531 1726882465.79196: done getting next task for host managed_node2 13531 1726882465.79199: ^ task is: TASK: meta (role_complete) 13531 1726882465.79203: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13531 1726882465.79216: getting variables 13531 1726882465.79217: in VariableManager get_vars() 13531 1726882465.79272: Calling all_inventory to load vars for managed_node2 13531 1726882465.79275: Calling groups_inventory to load vars for managed_node2 13531 1726882465.79277: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882465.79287: Calling all_plugins_play to load vars for managed_node2 13531 1726882465.79289: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882465.79292: Calling groups_plugins_play to load vars for managed_node2 13531 1726882465.80269: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882465.81346: done with get_vars() 13531 1726882465.81373: done getting variables 13531 1726882465.81471: done queuing things up, now waiting for results queue to drain 13531 1726882465.81474: results queue empty 13531 1726882465.81474: checking for any_errors_fatal 13531 1726882465.81477: done checking for any_errors_fatal 13531 1726882465.81478: checking for max_fail_percentage 13531 1726882465.81479: done checking for max_fail_percentage 13531 1726882465.81480: checking to see if all hosts have failed and the running result is not ok 13531 1726882465.81484: done checking to see if all hosts have failed 13531 1726882465.81484: getting the remaining hosts for this loop 13531 1726882465.81486: done getting the remaining hosts for this loop 13531 1726882465.81489: getting the next task for host managed_node2 13531 1726882465.81494: done getting next task for host managed_node2 13531 1726882465.81496: ^ task is: TASK: Delete the device '{{ controller_device }}' 13531 1726882465.81500: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13531 1726882465.81503: getting variables 13531 1726882465.81507: in VariableManager get_vars() 13531 1726882465.81533: Calling all_inventory to load vars for managed_node2 13531 1726882465.81535: Calling groups_inventory to load vars for managed_node2 13531 1726882465.81537: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882465.81546: Calling all_plugins_play to load vars for managed_node2 13531 1726882465.81550: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882465.81554: Calling groups_plugins_play to load vars for managed_node2 13531 1726882465.82697: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882465.83879: done with get_vars() 13531 1726882465.83895: done getting variables 13531 1726882465.83923: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13531 1726882465.84013: variable 'controller_device' from source: play vars TASK [Delete the device 'nm-bond'] ********************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:242 Friday 20 September 2024 21:34:25 -0400 (0:00:00.446) 0:00:53.735 ****** 13531 1726882465.84036: entering _queue_task() for managed_node2/command 13531 1726882465.84277: worker is 1 (out of 1 available) 13531 1726882465.84290: exiting 
_queue_task() for managed_node2/command 13531 1726882465.84303: done queuing things up, now waiting for results queue to drain 13531 1726882465.84305: waiting for pending results... 13531 1726882465.84492: running TaskExecutor() for managed_node2/TASK: Delete the device 'nm-bond' 13531 1726882465.84566: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000001b1 13531 1726882465.84658: variable 'ansible_search_path' from source: unknown 13531 1726882465.84665: calling self._execute() 13531 1726882465.84731: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882465.84735: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882465.84776: variable 'omit' from source: magic vars 13531 1726882465.85098: variable 'ansible_distribution_major_version' from source: facts 13531 1726882465.85116: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882465.85147: variable 'omit' from source: magic vars 13531 1726882465.85151: variable 'omit' from source: magic vars 13531 1726882465.85228: variable 'controller_device' from source: play vars 13531 1726882465.85241: variable 'omit' from source: magic vars 13531 1726882465.85295: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882465.85316: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882465.85331: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882465.85345: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882465.85356: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882465.85381: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882465.85384: 
variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882465.85386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882465.85460: Set connection var ansible_pipelining to False 13531 1726882465.85465: Set connection var ansible_timeout to 10 13531 1726882465.85470: Set connection var ansible_shell_executable to /bin/sh 13531 1726882465.85475: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882465.85477: Set connection var ansible_connection to ssh 13531 1726882465.85479: Set connection var ansible_shell_type to sh 13531 1726882465.85500: variable 'ansible_shell_executable' from source: unknown 13531 1726882465.85503: variable 'ansible_connection' from source: unknown 13531 1726882465.85506: variable 'ansible_module_compression' from source: unknown 13531 1726882465.85508: variable 'ansible_shell_type' from source: unknown 13531 1726882465.85510: variable 'ansible_shell_executable' from source: unknown 13531 1726882465.85514: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882465.85516: variable 'ansible_pipelining' from source: unknown 13531 1726882465.85518: variable 'ansible_timeout' from source: unknown 13531 1726882465.85524: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882465.85630: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882465.85643: variable 'omit' from source: magic vars 13531 1726882465.85646: starting attempt loop 13531 1726882465.85648: running the handler 13531 1726882465.85664: _low_level_execute_command(): starting 13531 1726882465.85672: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13531 
1726882465.86313: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882465.86331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882465.86345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882465.86389: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882465.86409: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882465.86456: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882465.86471: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882465.86478: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882465.86593: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882465.88263: stdout chunk (state=3): >>>/root <<< 13531 1726882465.88368: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882465.88449: stderr chunk (state=3): >>><<< 13531 1726882465.88452: stdout chunk (state=3): >>><<< 13531 1726882465.88487: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882465.88502: _low_level_execute_command(): starting 13531 1726882465.88506: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882465.8848777-15922-278313155966382 `" && echo ansible-tmp-1726882465.8848777-15922-278313155966382="` echo /root/.ansible/tmp/ansible-tmp-1726882465.8848777-15922-278313155966382 `" ) && sleep 0' 13531 1726882465.89080: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882465.89089: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882465.89107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882465.89137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882465.89253: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882465.89271: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882465.89283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882465.89293: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882465.89295: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882465.89298: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882465.89301: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882465.89303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882465.89313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882465.89316: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882465.89329: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882465.89359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882465.89508: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882465.89518: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882465.89520: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882465.89667: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882465.91508: stdout chunk (state=3): >>>ansible-tmp-1726882465.8848777-15922-278313155966382=/root/.ansible/tmp/ansible-tmp-1726882465.8848777-15922-278313155966382 <<< 13531 1726882465.91614: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 
13531 1726882465.91713: stderr chunk (state=3): >>><<< 13531 1726882465.91716: stdout chunk (state=3): >>><<< 13531 1726882465.91730: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882465.8848777-15922-278313155966382=/root/.ansible/tmp/ansible-tmp-1726882465.8848777-15922-278313155966382 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882465.91760: variable 'ansible_module_compression' from source: unknown 13531 1726882465.91800: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13531i_snf1g8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13531 1726882465.91830: variable 'ansible_facts' from source: unknown 13531 1726882465.91901: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882465.8848777-15922-278313155966382/AnsiballZ_command.py 13531 1726882465.92026: Sending initial data 13531 1726882465.92030: Sent initial data 
(156 bytes) 13531 1726882465.94188: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882465.94206: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882465.94227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882465.94247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882465.94319: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882465.94336: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882465.94351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882465.94410: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882465.94429: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882465.94445: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882465.94457: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882465.94473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882465.94511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882465.94526: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882465.94542: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882465.94561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882465.94752: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882465.94769: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 
1726882465.94794: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882465.94958: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882465.96759: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 13531 1726882465.96778: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13531 1726882465.96869: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13531 1726882465.96959: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13531i_snf1g8/tmp8mqz3reh /root/.ansible/tmp/ansible-tmp-1726882465.8848777-15922-278313155966382/AnsiballZ_command.py <<< 13531 1726882465.97067: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13531 1726882465.98812: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882465.98892: stderr chunk (state=3): >>><<< 13531 1726882465.98895: stdout chunk (state=3): >>><<< 13531 1726882465.98941: done transferring module to remote 13531 1726882465.98944: _low_level_execute_command(): starting 13531 1726882465.98946: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882465.8848777-15922-278313155966382/ /root/.ansible/tmp/ansible-tmp-1726882465.8848777-15922-278313155966382/AnsiballZ_command.py && sleep 
0' 13531 1726882465.99671: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882465.99675: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882465.99681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882465.99683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882465.99708: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882465.99715: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882465.99724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882465.99736: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882465.99743: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882465.99749: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882465.99756: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882465.99769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882465.99781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882465.99786: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882465.99814: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882465.99817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882465.99886: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882465.99903: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 
1726882465.99910: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882466.00186: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882466.02047: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882466.02051: stdout chunk (state=3): >>><<< 13531 1726882466.02060: stderr chunk (state=3): >>><<< 13531 1726882466.02078: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882466.02081: _low_level_execute_command(): starting 13531 1726882466.02084: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882465.8848777-15922-278313155966382/AnsiballZ_command.py && sleep 0' 13531 1726882466.02701: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 
13531 1726882466.02710: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882466.02719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882466.02732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882466.02767: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882466.02779: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882466.02788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882466.02801: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882466.02808: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882466.02815: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882466.02823: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882466.02832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882466.02843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882466.02850: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882466.02859: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882466.02868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882466.02940: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882466.02956: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882466.02965: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 
1726882466.03137: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882466.17083: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-20 21:34:26.161452", "end": "2024-09-20 21:34:26.168917", "delta": "0:00:00.007465", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13531 1726882466.18170: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.11.158 closed. <<< 13531 1726882466.18222: stderr chunk (state=3): >>><<< 13531 1726882466.18225: stdout chunk (state=3): >>><<< 13531 1726882466.18241: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-20 21:34:26.161452", "end": "2024-09-20 21:34:26.168917", "delta": "0:00:00.007465", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.11.158 closed. 13531 1726882466.18274: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882465.8848777-15922-278313155966382/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13531 1726882466.18280: _low_level_execute_command(): starting 13531 1726882466.18285: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882465.8848777-15922-278313155966382/ > /dev/null 2>&1 && sleep 0' 13531 1726882466.18721: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 13531 1726882466.18724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882466.18782: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882466.18785: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882466.18788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 13531 1726882466.18790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882466.18841: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882466.18847: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882466.18850: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882466.18944: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882466.20762: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882466.20810: stderr chunk (state=3): >>><<< 13531 1726882466.20813: stdout chunk (state=3): >>><<< 13531 1726882466.20825: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882466.20831: handler run complete 13531 1726882466.20849: Evaluated conditional (False): False 13531 1726882466.20852: Evaluated conditional (False): False 13531 1726882466.20862: attempt loop complete, returning result 13531 1726882466.20866: _execute() done 13531 1726882466.20869: dumping result to json 13531 1726882466.20874: done dumping result, returning 13531 1726882466.20881: done running TaskExecutor() for managed_node2/TASK: Delete the device 'nm-bond' [0e448fcc-3ce9-4fd9-519d-0000000001b1] 13531 1726882466.20889: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000001b1 13531 1726882466.20990: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000001b1 13531 1726882466.20994: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ip", "link", "del", "nm-bond" ], "delta": "0:00:00.007465", "end": "2024-09-20 21:34:26.168917", "failed_when_result": false, "rc": 1, "start": "2024-09-20 21:34:26.161452" } STDERR: Cannot find device "nm-bond" MSG: non-zero return code 13531 1726882466.21069: no more pending results, returning 
what we have 13531 1726882466.21073: results queue empty 13531 1726882466.21074: checking for any_errors_fatal 13531 1726882466.21076: done checking for any_errors_fatal 13531 1726882466.21077: checking for max_fail_percentage 13531 1726882466.21078: done checking for max_fail_percentage 13531 1726882466.21079: checking to see if all hosts have failed and the running result is not ok 13531 1726882466.21080: done checking to see if all hosts have failed 13531 1726882466.21081: getting the remaining hosts for this loop 13531 1726882466.21082: done getting the remaining hosts for this loop 13531 1726882466.21085: getting the next task for host managed_node2 13531 1726882466.21093: done getting next task for host managed_node2 13531 1726882466.21096: ^ task is: TASK: Remove test interfaces 13531 1726882466.21099: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13531 1726882466.21104: getting variables 13531 1726882466.21106: in VariableManager get_vars() 13531 1726882466.21158: Calling all_inventory to load vars for managed_node2 13531 1726882466.21160: Calling groups_inventory to load vars for managed_node2 13531 1726882466.21163: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882466.21175: Calling all_plugins_play to load vars for managed_node2 13531 1726882466.21177: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882466.21180: Calling groups_plugins_play to load vars for managed_node2 13531 1726882466.22067: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882466.23047: done with get_vars() 13531 1726882466.23068: done getting variables 13531 1726882466.23112: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interfaces] ************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3 Friday 20 September 2024 21:34:26 -0400 (0:00:00.391) 0:00:54.126 ****** 13531 1726882466.23149: entering _queue_task() for managed_node2/shell 13531 1726882466.23371: worker is 1 (out of 1 available) 13531 1726882466.23381: exiting _queue_task() for managed_node2/shell 13531 1726882466.23392: done queuing things up, now waiting for results queue to drain 13531 1726882466.23393: waiting for pending results... 
13531 1726882466.23652: running TaskExecutor() for managed_node2/TASK: Remove test interfaces 13531 1726882466.23807: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000001b5 13531 1726882466.23827: variable 'ansible_search_path' from source: unknown 13531 1726882466.23834: variable 'ansible_search_path' from source: unknown 13531 1726882466.23884: calling self._execute() 13531 1726882466.23997: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882466.24008: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882466.24025: variable 'omit' from source: magic vars 13531 1726882466.24429: variable 'ansible_distribution_major_version' from source: facts 13531 1726882466.24447: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882466.24462: variable 'omit' from source: magic vars 13531 1726882466.24521: variable 'omit' from source: magic vars 13531 1726882466.24674: variable 'dhcp_interface1' from source: play vars 13531 1726882466.24677: variable 'dhcp_interface2' from source: play vars 13531 1726882466.24692: variable 'omit' from source: magic vars 13531 1726882466.24725: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882466.24750: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882466.24771: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882466.24785: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882466.24795: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882466.24818: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882466.24821: variable 'ansible_host' from source: host 
vars for 'managed_node2' 13531 1726882466.24823: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882466.24902: Set connection var ansible_pipelining to False 13531 1726882466.24905: Set connection var ansible_timeout to 10 13531 1726882466.24911: Set connection var ansible_shell_executable to /bin/sh 13531 1726882466.24916: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882466.24918: Set connection var ansible_connection to ssh 13531 1726882466.24921: Set connection var ansible_shell_type to sh 13531 1726882466.24939: variable 'ansible_shell_executable' from source: unknown 13531 1726882466.24942: variable 'ansible_connection' from source: unknown 13531 1726882466.24945: variable 'ansible_module_compression' from source: unknown 13531 1726882466.24947: variable 'ansible_shell_type' from source: unknown 13531 1726882466.24949: variable 'ansible_shell_executable' from source: unknown 13531 1726882466.24951: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882466.24958: variable 'ansible_pipelining' from source: unknown 13531 1726882466.24961: variable 'ansible_timeout' from source: unknown 13531 1726882466.24964: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882466.25071: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882466.25084: variable 'omit' from source: magic vars 13531 1726882466.25087: starting attempt loop 13531 1726882466.25089: running the handler 13531 1726882466.25097: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882466.25112: _low_level_execute_command(): starting 13531 1726882466.25119: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13531 1726882466.25616: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882466.25639: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882466.25653: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882466.25668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882466.25715: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882466.25725: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882466.25837: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882466.27446: stdout chunk (state=3): >>>/root <<< 13531 1726882466.27582: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882466.27649: 
stderr chunk (state=3): >>><<< 13531 1726882466.27652: stdout chunk (state=3): >>><<< 13531 1726882466.27771: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882466.27775: _low_level_execute_command(): starting 13531 1726882466.27778: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882466.2767825-15942-263281833461252 `" && echo ansible-tmp-1726882466.2767825-15942-263281833461252="` echo /root/.ansible/tmp/ansible-tmp-1726882466.2767825-15942-263281833461252 `" ) && sleep 0' 13531 1726882466.28362: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882466.28369: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 13531 1726882466.28379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882466.28406: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 13531 1726882466.28409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882466.28412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882466.28467: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882466.28474: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882466.28482: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882466.28622: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882466.30533: stdout chunk (state=3): >>>ansible-tmp-1726882466.2767825-15942-263281833461252=/root/.ansible/tmp/ansible-tmp-1726882466.2767825-15942-263281833461252 <<< 13531 1726882466.30649: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882466.30730: stderr chunk (state=3): >>><<< 13531 1726882466.30739: stdout chunk (state=3): >>><<< 13531 1726882466.30769: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882466.2767825-15942-263281833461252=/root/.ansible/tmp/ansible-tmp-1726882466.2767825-15942-263281833461252 , 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882466.31052: variable 'ansible_module_compression' from source: unknown 13531 1726882466.31057: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13531i_snf1g8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13531 1726882466.31060: variable 'ansible_facts' from source: unknown 13531 1726882466.31062: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882466.2767825-15942-263281833461252/AnsiballZ_command.py 13531 1726882466.31434: Sending initial data 13531 1726882466.31437: Sent initial data (156 bytes) 13531 1726882466.32479: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882466.32488: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882466.32499: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 13531 1726882466.32512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882466.32549: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882466.32559: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882466.32571: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882466.32591: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882466.32596: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882466.32604: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882466.32612: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882466.32621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882466.32633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882466.32641: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882466.32649: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882466.32659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882466.32739: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882466.32753: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882466.32763: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882466.32890: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882466.34622: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server 
supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13531 1726882466.34715: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13531 1726882466.34811: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13531i_snf1g8/tmpkc_2s6z_ /root/.ansible/tmp/ansible-tmp-1726882466.2767825-15942-263281833461252/AnsiballZ_command.py <<< 13531 1726882466.34904: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13531 1726882466.36157: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882466.36551: stderr chunk (state=3): >>><<< 13531 1726882466.36557: stdout chunk (state=3): >>><<< 13531 1726882466.36560: done transferring module to remote 13531 1726882466.36562: _low_level_execute_command(): starting 13531 1726882466.36570: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882466.2767825-15942-263281833461252/ /root/.ansible/tmp/ansible-tmp-1726882466.2767825-15942-263281833461252/AnsiballZ_command.py && sleep 0' 13531 1726882466.37273: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882466.37287: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882466.37301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882466.37319: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882466.37360: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882466.37384: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882466.37398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882466.37415: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882466.37427: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882466.37437: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882466.37448: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882466.37466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882466.37482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882466.37493: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882466.37503: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882466.37517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882466.37589: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882466.37606: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882466.37620: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882466.37750: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882466.39573: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882466.39641: stderr chunk (state=3): >>><<< 
13531 1726882466.39644: stdout chunk (state=3): >>><<< 13531 1726882466.39737: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882466.39742: _low_level_execute_command(): starting 13531 1726882466.39745: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882466.2767825-15942-263281833461252/AnsiballZ_command.py && sleep 0' 13531 1726882466.40347: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882466.40368: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882466.40384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882466.40409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 
1726882466.40450: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882466.40469: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882466.40485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882466.40502: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882466.40519: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882466.40531: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882466.40544: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882466.40561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882466.40581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882466.40593: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882466.40605: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882466.40618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882466.40700: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882466.40717: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882466.40735: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882466.40883: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882466.60393: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 
0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-20 21:34:26.540514", "end": "2024-09-20 21:34:26.600364", "delta": "0:00:00.059850", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13531 1726882466.61773: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 13531 1726882466.61807: stderr chunk (state=3): >>><<< 13531 1726882466.61810: stdout chunk (state=3): >>><<< 13531 1726882466.61870: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-20 21:34:26.540514", "end": "2024-09-20 21:34:26.600364", "delta": "0:00:00.059850", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 13531 1726882466.61958: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test1 - error "$rc"\nfi\nip link delete test2 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test2 - error "$rc"\nfi\nip link delete testbr || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link testbr - error "$rc"\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882466.2767825-15942-263281833461252/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, 
'_ansible_target_log_info': None}) 13531 1726882466.61962: _low_level_execute_command(): starting 13531 1726882466.61969: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882466.2767825-15942-263281833461252/ > /dev/null 2>&1 && sleep 0' 13531 1726882466.62471: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882466.62478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882466.62523: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882466.62526: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882466.62528: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882466.62580: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882466.62583: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882466.62688: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882466.64503: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882466.64592: stderr chunk (state=3): >>><<< 13531 1726882466.64596: 
stdout chunk (state=3): >>><<< 13531 1726882466.64624: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882466.64651: handler run complete 13531 1726882466.64685: Evaluated conditional (False): False 13531 1726882466.64699: attempt loop complete, returning result 13531 1726882466.64706: _execute() done 13531 1726882466.64710: dumping result to json 13531 1726882466.64729: done dumping result, returning 13531 1726882466.64742: done running TaskExecutor() for managed_node2/TASK: Remove test interfaces [0e448fcc-3ce9-4fd9-519d-0000000001b5] 13531 1726882466.64761: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000001b5 13531 1726882466.64958: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000001b5 13531 1726882466.64968: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euxo 
pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "delta": "0:00:00.059850", "end": "2024-09-20 21:34:26.600364", "rc": 0, "start": "2024-09-20 21:34:26.540514" } STDERR: + exec + rc=0 + ip link delete test1 + '[' 0 '!=' 0 ']' + ip link delete test2 + '[' 0 '!=' 0 ']' + ip link delete testbr + '[' 0 '!=' 0 ']' 13531 1726882466.65078: no more pending results, returning what we have 13531 1726882466.65090: results queue empty 13531 1726882466.65092: checking for any_errors_fatal 13531 1726882466.65103: done checking for any_errors_fatal 13531 1726882466.65107: checking for max_fail_percentage 13531 1726882466.65109: done checking for max_fail_percentage 13531 1726882466.65110: checking to see if all hosts have failed and the running result is not ok 13531 1726882466.65111: done checking to see if all hosts have failed 13531 1726882466.65111: getting the remaining hosts for this loop 13531 1726882466.65113: done getting the remaining hosts for this loop 13531 1726882466.65117: getting the next task for host managed_node2 13531 1726882466.65122: done getting next task for host managed_node2 13531 1726882466.65124: ^ task is: TASK: Stop dnsmasq/radvd services 13531 1726882466.65128: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13531 1726882466.65135: getting variables 13531 1726882466.65136: in VariableManager get_vars() 13531 1726882466.65218: Calling all_inventory to load vars for managed_node2 13531 1726882466.65222: Calling groups_inventory to load vars for managed_node2 13531 1726882466.65225: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882466.65255: Calling all_plugins_play to load vars for managed_node2 13531 1726882466.65258: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882466.65267: Calling groups_plugins_play to load vars for managed_node2 13531 1726882466.67010: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882466.68529: done with get_vars() 13531 1726882466.68550: done getting variables 13531 1726882466.68599: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Stop dnsmasq/radvd services] ********************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:23 Friday 20 September 2024 21:34:26 -0400 (0:00:00.454) 0:00:54.581 ****** 13531 1726882466.68634: entering _queue_task() for managed_node2/shell 13531 1726882466.69075: worker is 1 (out of 1 
available) 13531 1726882466.69088: exiting _queue_task() for managed_node2/shell 13531 1726882466.69099: done queuing things up, now waiting for results queue to drain 13531 1726882466.69100: waiting for pending results... 13531 1726882466.69396: running TaskExecutor() for managed_node2/TASK: Stop dnsmasq/radvd services 13531 1726882466.69533: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000001b6 13531 1726882466.69550: variable 'ansible_search_path' from source: unknown 13531 1726882466.69556: variable 'ansible_search_path' from source: unknown 13531 1726882466.69591: calling self._execute() 13531 1726882466.69694: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882466.69699: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882466.69708: variable 'omit' from source: magic vars 13531 1726882466.70129: variable 'ansible_distribution_major_version' from source: facts 13531 1726882466.70143: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882466.70146: variable 'omit' from source: magic vars 13531 1726882466.70207: variable 'omit' from source: magic vars 13531 1726882466.70248: variable 'omit' from source: magic vars 13531 1726882466.70290: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882466.70326: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882466.70348: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882466.70372: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882466.70395: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882466.70417: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 13531 1726882466.70420: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882466.70422: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882466.70506: Set connection var ansible_pipelining to False 13531 1726882466.70509: Set connection var ansible_timeout to 10 13531 1726882466.70518: Set connection var ansible_shell_executable to /bin/sh 13531 1726882466.70539: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882466.70543: Set connection var ansible_connection to ssh 13531 1726882466.70546: Set connection var ansible_shell_type to sh 13531 1726882466.70574: variable 'ansible_shell_executable' from source: unknown 13531 1726882466.70577: variable 'ansible_connection' from source: unknown 13531 1726882466.70580: variable 'ansible_module_compression' from source: unknown 13531 1726882466.70583: variable 'ansible_shell_type' from source: unknown 13531 1726882466.70585: variable 'ansible_shell_executable' from source: unknown 13531 1726882466.70587: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882466.70589: variable 'ansible_pipelining' from source: unknown 13531 1726882466.70593: variable 'ansible_timeout' from source: unknown 13531 1726882466.70595: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882466.70712: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882466.70718: variable 'omit' from source: magic vars 13531 1726882466.70721: starting attempt loop 13531 1726882466.70723: running the handler 13531 1726882466.70732: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py 
(searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882466.70747: _low_level_execute_command(): starting 13531 1726882466.70753: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13531 1726882466.71468: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882466.71487: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882466.71504: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882466.71602: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882466.71621: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882466.71661: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882466.71758: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882466.73832: stdout chunk (state=3): >>>/root <<< 13531 1726882466.73835: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 
1726882466.73848: stdout chunk (state=3): >>><<< 13531 1726882466.73851: stderr chunk (state=3): >>><<< 13531 1726882466.73875: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882466.73892: _low_level_execute_command(): starting 13531 1726882466.73898: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882466.738738-15973-203183112368835 `" && echo ansible-tmp-1726882466.738738-15973-203183112368835="` echo /root/.ansible/tmp/ansible-tmp-1726882466.738738-15973-203183112368835 `" ) && sleep 0' 13531 1726882466.74808: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882466.74817: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882466.74828: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882466.74869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882466.74908: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882466.74916: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882466.74927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882466.74940: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882466.74949: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882466.74959: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882466.75023: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882466.75034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882466.75046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882466.75056: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882466.75059: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882466.75072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882466.75149: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882466.75167: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882466.75179: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882466.75304: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882466.77172: stdout 
chunk (state=3): >>>ansible-tmp-1726882466.738738-15973-203183112368835=/root/.ansible/tmp/ansible-tmp-1726882466.738738-15973-203183112368835 <<< 13531 1726882466.77319: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882466.77411: stderr chunk (state=3): >>><<< 13531 1726882466.77436: stdout chunk (state=3): >>><<< 13531 1726882466.77484: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882466.738738-15973-203183112368835=/root/.ansible/tmp/ansible-tmp-1726882466.738738-15973-203183112368835 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882466.77525: variable 'ansible_module_compression' from source: unknown 13531 1726882466.77615: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13531i_snf1g8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13531 1726882466.77652: variable 'ansible_facts' from source: 
unknown 13531 1726882466.77846: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882466.738738-15973-203183112368835/AnsiballZ_command.py 13531 1726882466.78127: Sending initial data 13531 1726882466.78132: Sent initial data (155 bytes) 13531 1726882466.80757: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882466.80766: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882466.80777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882466.80793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882466.80838: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882466.80845: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882466.80859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882466.80870: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882466.80878: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882466.80884: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882466.80892: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882466.80903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882466.80921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882466.80929: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882466.80936: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882466.80945: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882466.81019: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882466.81042: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882466.81068: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882466.81188: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882466.83005: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13531 1726882466.83104: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13531 1726882466.83205: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13531i_snf1g8/tmp_8dcb59i /root/.ansible/tmp/ansible-tmp-1726882466.738738-15973-203183112368835/AnsiballZ_command.py <<< 13531 1726882466.83300: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13531 1726882466.84577: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882466.85069: stderr chunk (state=3): >>><<< 13531 1726882466.85073: stdout chunk (state=3): >>><<< 13531 1726882466.85075: done transferring module to remote 13531 1726882466.85077: _low_level_execute_command(): starting 13531 1726882466.85080: _low_level_execute_command(): executing: /bin/sh -c 
'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882466.738738-15973-203183112368835/ /root/.ansible/tmp/ansible-tmp-1726882466.738738-15973-203183112368835/AnsiballZ_command.py && sleep 0' 13531 1726882466.85902: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882466.86027: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882466.86038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882466.86053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882466.86092: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882466.86132: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882466.86142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882466.86158: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882466.86161: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882466.86171: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882466.86179: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882466.86227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882466.86230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882466.86241: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882466.86248: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882466.86260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 
1726882466.86328: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882466.86460: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882466.86475: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882466.86602: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882466.88499: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882466.88503: stdout chunk (state=3): >>><<< 13531 1726882466.88508: stderr chunk (state=3): >>><<< 13531 1726882466.88525: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882466.88528: _low_level_execute_command(): starting 13531 1726882466.88531: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1726882466.738738-15973-203183112368835/AnsiballZ_command.py && sleep 0' 13531 1726882466.90371: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882466.90383: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882466.90394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882466.90408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882466.90447: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882466.90457: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882466.90466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882466.90480: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882466.90488: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882466.90499: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882466.90506: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882466.90516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882466.90527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882466.90534: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882466.90541: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882466.90550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882466.90624: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 
1726882466.90644: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882466.90658: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882466.90843: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882467.06177: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-20 21:34:27.039361", "end": "2024-09-20 21:34:27.059624", "delta": "0:00:00.020263", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13531 1726882467.07390: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 13531 1726882467.07469: stderr chunk (state=3): >>><<< 13531 1726882467.07472: stdout chunk (state=3): >>><<< 13531 1726882467.07498: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-20 21:34:27.039361", "end": "2024-09-20 21:34:27.059624", "delta": "0:00:00.020263", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
13531 1726882467.07538: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep \'release 6\' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service="$service"; then\n firewall-cmd --remove-service "$service"\n fi\n done\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882466.738738-15973-203183112368835/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13531 1726882467.07547: _low_level_execute_command(): starting 13531 1726882467.07552: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882466.738738-15973-203183112368835/ > /dev/null 2>&1 && sleep 0' 13531 1726882467.08882: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882467.08887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882467.08927: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 
10.31.11.158 debug2: match not found <<< 13531 1726882467.08934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882467.08957: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882467.08961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 13531 1726882467.08977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882467.09057: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882467.09076: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882467.09080: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882467.09200: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882467.11343: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882467.11410: stderr chunk (state=3): >>><<< 13531 1726882467.11413: stdout chunk (state=3): >>><<< 13531 1726882467.11431: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882467.11441: handler run complete 13531 1726882467.11462: Evaluated conditional (False): False 13531 1726882467.11475: attempt loop complete, returning result 13531 1726882467.11478: _execute() done 13531 1726882467.11480: dumping result to json 13531 1726882467.11485: done dumping result, returning 13531 1726882467.11494: done running TaskExecutor() for managed_node2/TASK: Stop dnsmasq/radvd services [0e448fcc-3ce9-4fd9-519d-0000000001b6] 13531 1726882467.11502: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000001b6 13531 1726882467.11617: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000001b6 13531 1726882467.11620: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "delta": "0:00:00.020263", "end": "2024-09-20 21:34:27.059624", "rc": 0, "start": 
"2024-09-20 21:34:27.039361" } STDERR: + exec + pkill -F /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.lease + grep 'release 6' /etc/redhat-release + systemctl is-active firewalld inactive 13531 1726882467.11685: no more pending results, returning what we have 13531 1726882467.11689: results queue empty 13531 1726882467.11690: checking for any_errors_fatal 13531 1726882467.11699: done checking for any_errors_fatal 13531 1726882467.11700: checking for max_fail_percentage 13531 1726882467.11701: done checking for max_fail_percentage 13531 1726882467.11702: checking to see if all hosts have failed and the running result is not ok 13531 1726882467.11703: done checking to see if all hosts have failed 13531 1726882467.11704: getting the remaining hosts for this loop 13531 1726882467.11705: done getting the remaining hosts for this loop 13531 1726882467.11708: getting the next task for host managed_node2 13531 1726882467.11715: done getting next task for host managed_node2 13531 1726882467.11718: ^ task is: TASK: Restore the /etc/resolv.conf for initscript 13531 1726882467.11721: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13531 1726882467.11725: getting variables 13531 1726882467.11727: in VariableManager get_vars() 13531 1726882467.11781: Calling all_inventory to load vars for managed_node2 13531 1726882467.11784: Calling groups_inventory to load vars for managed_node2 13531 1726882467.11786: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882467.11801: Calling all_plugins_play to load vars for managed_node2 13531 1726882467.11803: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882467.11806: Calling groups_plugins_play to load vars for managed_node2 13531 1726882467.14410: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882467.17110: done with get_vars() 13531 1726882467.17134: done getting variables 13531 1726882467.17196: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Restore the /etc/resolv.conf for initscript] ***************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:248 Friday 20 September 2024 21:34:27 -0400 (0:00:00.485) 0:00:55.067 ****** 13531 1726882467.17227: entering _queue_task() for managed_node2/command 13531 1726882467.17549: worker is 1 (out of 1 available) 13531 1726882467.17562: exiting _queue_task() for managed_node2/command 13531 1726882467.17577: done queuing things up, now waiting for results queue to drain 13531 1726882467.17578: waiting for pending results... 
13531 1726882467.17864: running TaskExecutor() for managed_node2/TASK: Restore the /etc/resolv.conf for initscript 13531 1726882467.17970: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000001b7 13531 1726882467.17987: variable 'ansible_search_path' from source: unknown 13531 1726882467.18025: calling self._execute() 13531 1726882467.18120: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882467.18124: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882467.18141: variable 'omit' from source: magic vars 13531 1726882467.18622: variable 'ansible_distribution_major_version' from source: facts 13531 1726882467.18635: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882467.19571: variable 'network_provider' from source: set_fact 13531 1726882467.19575: Evaluated conditional (network_provider == "initscripts"): False 13531 1726882467.19577: when evaluation is False, skipping this task 13531 1726882467.19579: _execute() done 13531 1726882467.19581: dumping result to json 13531 1726882467.19583: done dumping result, returning 13531 1726882467.19586: done running TaskExecutor() for managed_node2/TASK: Restore the /etc/resolv.conf for initscript [0e448fcc-3ce9-4fd9-519d-0000000001b7] 13531 1726882467.19588: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000001b7 13531 1726882467.19660: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000001b7 skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 13531 1726882467.19702: no more pending results, returning what we have 13531 1726882467.19705: results queue empty 13531 1726882467.19706: checking for any_errors_fatal 13531 1726882467.19715: done checking for any_errors_fatal 13531 1726882467.19715: checking for max_fail_percentage 13531 1726882467.19717: done checking for max_fail_percentage 13531 
1726882467.19718: checking to see if all hosts have failed and the running result is not ok 13531 1726882467.19719: done checking to see if all hosts have failed 13531 1726882467.19719: getting the remaining hosts for this loop 13531 1726882467.19722: done getting the remaining hosts for this loop 13531 1726882467.19725: getting the next task for host managed_node2 13531 1726882467.19731: done getting next task for host managed_node2 13531 1726882467.19734: ^ task is: TASK: Verify network state restored to default 13531 1726882467.19737: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13531 1726882467.19741: getting variables 13531 1726882467.19743: in VariableManager get_vars() 13531 1726882467.19791: Calling all_inventory to load vars for managed_node2 13531 1726882467.19794: Calling groups_inventory to load vars for managed_node2 13531 1726882467.19797: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882467.19808: Calling all_plugins_play to load vars for managed_node2 13531 1726882467.19811: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882467.19816: Calling groups_plugins_play to load vars for managed_node2 13531 1726882467.20334: WORKER PROCESS EXITING 13531 1726882467.22079: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882467.24175: done with get_vars() 13531 1726882467.24206: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:253 Friday 20 September 2024 21:34:27 -0400 (0:00:00.070) 0:00:55.138 ****** 13531 1726882467.24304: entering _queue_task() for managed_node2/include_tasks 13531 1726882467.24629: worker is 1 (out of 1 available) 13531 1726882467.24641: exiting _queue_task() for managed_node2/include_tasks 13531 1726882467.24654: done queuing things up, now waiting for results queue to drain 13531 1726882467.24656: waiting for pending results... 
13531 1726882467.24942: running TaskExecutor() for managed_node2/TASK: Verify network state restored to default 13531 1726882467.25280: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000001b8 13531 1726882467.25294: variable 'ansible_search_path' from source: unknown 13531 1726882467.25332: calling self._execute() 13531 1726882467.25422: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882467.25426: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882467.25443: variable 'omit' from source: magic vars 13531 1726882467.25838: variable 'ansible_distribution_major_version' from source: facts 13531 1726882467.25851: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882467.25859: _execute() done 13531 1726882467.25865: dumping result to json 13531 1726882467.25868: done dumping result, returning 13531 1726882467.25872: done running TaskExecutor() for managed_node2/TASK: Verify network state restored to default [0e448fcc-3ce9-4fd9-519d-0000000001b8] 13531 1726882467.25883: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000001b8 13531 1726882467.25983: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000001b8 13531 1726882467.25987: WORKER PROCESS EXITING 13531 1726882467.26016: no more pending results, returning what we have 13531 1726882467.26024: in VariableManager get_vars() 13531 1726882467.26096: Calling all_inventory to load vars for managed_node2 13531 1726882467.26099: Calling groups_inventory to load vars for managed_node2 13531 1726882467.26102: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882467.26118: Calling all_plugins_play to load vars for managed_node2 13531 1726882467.26122: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882467.26126: Calling groups_plugins_play to load vars for managed_node2 13531 1726882467.28019: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882467.35258: done with get_vars() 13531 1726882467.35280: variable 'ansible_search_path' from source: unknown 13531 1726882467.35292: we have included files to process 13531 1726882467.35292: generating all_blocks data 13531 1726882467.35294: done generating all_blocks data 13531 1726882467.35295: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 13531 1726882467.35296: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 13531 1726882467.35297: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 13531 1726882467.35573: done processing included file 13531 1726882467.35575: iterating over new_blocks loaded from include file 13531 1726882467.35575: in VariableManager get_vars() 13531 1726882467.35593: done with get_vars() 13531 1726882467.35594: filtering new block on tags 13531 1726882467.35613: done filtering new block on tags 13531 1726882467.35615: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node2 13531 1726882467.35618: extending task lists for all hosts with included blocks 13531 1726882467.36315: done extending task lists 13531 1726882467.36316: done processing included files 13531 1726882467.36317: results queue empty 13531 1726882467.36317: checking for any_errors_fatal 13531 1726882467.36319: done checking for any_errors_fatal 13531 1726882467.36320: checking for max_fail_percentage 13531 1726882467.36321: done checking for max_fail_percentage 13531 1726882467.36321: checking to see if all hosts have failed and the running 
result is not ok 13531 1726882467.36322: done checking to see if all hosts have failed 13531 1726882467.36322: getting the remaining hosts for this loop 13531 1726882467.36323: done getting the remaining hosts for this loop 13531 1726882467.36324: getting the next task for host managed_node2 13531 1726882467.36326: done getting next task for host managed_node2 13531 1726882467.36328: ^ task is: TASK: Check routes and DNS 13531 1726882467.36329: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13531 1726882467.36331: getting variables 13531 1726882467.36331: in VariableManager get_vars() 13531 1726882467.36344: Calling all_inventory to load vars for managed_node2 13531 1726882467.36345: Calling groups_inventory to load vars for managed_node2 13531 1726882467.36347: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882467.36350: Calling all_plugins_play to load vars for managed_node2 13531 1726882467.36352: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882467.36354: Calling groups_plugins_play to load vars for managed_node2 13531 1726882467.37053: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882467.38552: done with get_vars() 13531 1726882467.38575: done getting variables 13531 1726882467.38605: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 21:34:27 -0400 (0:00:00.143) 0:00:55.281 ****** 13531 1726882467.38625: entering _queue_task() for managed_node2/shell 13531 1726882467.38886: worker is 1 (out of 1 available) 13531 1726882467.38899: exiting _queue_task() for managed_node2/shell 13531 1726882467.38911: done queuing things up, now waiting for results queue to drain 13531 1726882467.38912: waiting for pending results... 
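Both tasks above gate on `Evaluated conditional (ansible_distribution_major_version != '6'): True`. Note that this fact is a string, so the quoted `'6'` and the `!=` test are safe, but an ordering comparison on the same fact would be lexicographic and misleading. A small plain-Python illustration standing in for the Jinja2 evaluation (the `"9"` value is a hypothetical example, not read from this log):

```python
# ansible_distribution_major_version is a string fact, e.g. "9".
major = "9"
assert (major != '6') is True        # the conditional the log reports as True

# Equality/inequality on version strings is fine; ordering is the trap:
assert ("10" < "6") is True          # lexicographic string compare, NOT numeric
assert (int("10") < int("6")) is False
print("conditional holds:", major != '6')
```

For real ordering checks, conditionals should cast (`| int`) or use a version-aware test rather than comparing the raw string.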
13531 1726882467.39103: running TaskExecutor() for managed_node2/TASK: Check routes and DNS 13531 1726882467.39192: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000009f0 13531 1726882467.39203: variable 'ansible_search_path' from source: unknown 13531 1726882467.39207: variable 'ansible_search_path' from source: unknown 13531 1726882467.39235: calling self._execute() 13531 1726882467.39312: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882467.39317: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882467.39324: variable 'omit' from source: magic vars 13531 1726882467.39602: variable 'ansible_distribution_major_version' from source: facts 13531 1726882467.39612: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882467.39619: variable 'omit' from source: magic vars 13531 1726882467.39650: variable 'omit' from source: magic vars 13531 1726882467.39677: variable 'omit' from source: magic vars 13531 1726882467.39712: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882467.39740: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882467.39757: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882467.39771: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882467.39780: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882467.39804: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882467.39808: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882467.39810: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882467.39885: 
Set connection var ansible_pipelining to False 13531 1726882467.39889: Set connection var ansible_timeout to 10 13531 1726882467.39895: Set connection var ansible_shell_executable to /bin/sh 13531 1726882467.39898: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882467.39900: Set connection var ansible_connection to ssh 13531 1726882467.39905: Set connection var ansible_shell_type to sh 13531 1726882467.39925: variable 'ansible_shell_executable' from source: unknown 13531 1726882467.39928: variable 'ansible_connection' from source: unknown 13531 1726882467.39931: variable 'ansible_module_compression' from source: unknown 13531 1726882467.39933: variable 'ansible_shell_type' from source: unknown 13531 1726882467.39936: variable 'ansible_shell_executable' from source: unknown 13531 1726882467.39938: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882467.39940: variable 'ansible_pipelining' from source: unknown 13531 1726882467.39943: variable 'ansible_timeout' from source: unknown 13531 1726882467.39953: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882467.40044: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882467.40058: variable 'omit' from source: magic vars 13531 1726882467.40062: starting attempt loop 13531 1726882467.40064: running the handler 13531 1726882467.40074: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882467.40089: 
_low_level_execute_command(): starting 13531 1726882467.40095: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13531 1726882467.40788: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882467.40893: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882467.40986: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882467.41011: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882467.41035: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882467.41170: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882467.42827: stdout chunk (state=3): >>>/root <<< 13531 1726882467.42931: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882467.42981: stderr chunk (state=3): >>><<< 13531 1726882467.42988: stdout chunk (state=3): >>><<< 13531 1726882467.43015: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882467.43041: _low_level_execute_command(): starting 13531 1726882467.43054: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882467.4302447-16016-207572973032302 `" && echo ansible-tmp-1726882467.4302447-16016-207572973032302="` echo /root/.ansible/tmp/ansible-tmp-1726882467.4302447-16016-207572973032302 `" ) && sleep 0' 13531 1726882467.43924: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882467.43949: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882467.43969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882467.43983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882467.44026: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882467.44029: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882467.44032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882467.44085: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882467.44088: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882467.44222: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882467.46080: stdout chunk (state=3): >>>ansible-tmp-1726882467.4302447-16016-207572973032302=/root/.ansible/tmp/ansible-tmp-1726882467.4302447-16016-207572973032302 <<< 13531 1726882467.46194: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882467.46243: stderr chunk (state=3): >>><<< 13531 1726882467.46246: stdout chunk (state=3): >>><<< 13531 1726882467.46262: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882467.4302447-16016-207572973032302=/root/.ansible/tmp/ansible-tmp-1726882467.4302447-16016-207572973032302 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882467.46290: variable 'ansible_module_compression' from source: unknown 13531 1726882467.46334: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13531i_snf1g8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13531 1726882467.46364: variable 'ansible_facts' from source: unknown 13531 1726882467.46427: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882467.4302447-16016-207572973032302/AnsiballZ_command.py 13531 1726882467.46528: Sending initial data 13531 1726882467.46531: Sent initial data (156 bytes) 13531 1726882467.47486: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882467.47498: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882467.47512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882467.47538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882467.47578: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 
originally 10.31.11.158 <<< 13531 1726882467.47581: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882467.47603: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882467.47615: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882467.47618: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882467.47626: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882467.47635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882467.47656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882467.47712: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882467.47718: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882467.47827: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882467.49587: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 <<< 13531 1726882467.49591: stderr chunk (state=3): >>>debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server 
supports extension "expand-path@openssh.com" revision 1 <<< 13531 1726882467.49679: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13531 1726882467.49781: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13531i_snf1g8/tmpz5ej2l06 /root/.ansible/tmp/ansible-tmp-1726882467.4302447-16016-207572973032302/AnsiballZ_command.py <<< 13531 1726882467.49878: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13531 1726882467.50957: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882467.51096: stderr chunk (state=3): >>><<< 13531 1726882467.51099: stdout chunk (state=3): >>><<< 13531 1726882467.51115: done transferring module to remote 13531 1726882467.51124: _low_level_execute_command(): starting 13531 1726882467.51130: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882467.4302447-16016-207572973032302/ /root/.ansible/tmp/ansible-tmp-1726882467.4302447-16016-207572973032302/AnsiballZ_command.py && sleep 0' 13531 1726882467.51593: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882467.51598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882467.51641: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882467.51644: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882467.51653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882467.51703: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882467.51710: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882467.51831: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882467.53570: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882467.53656: stderr chunk (state=3): >>><<< 13531 1726882467.53659: stdout chunk (state=3): >>><<< 13531 1726882467.53680: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882467.53684: _low_level_execute_command(): starting 13531 1726882467.53686: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882467.4302447-16016-207572973032302/AnsiballZ_command.py && sleep 0' 13531 1726882467.54254: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882467.54257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882467.54307: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882467.54311: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882467.54313: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882467.54368: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882467.54372: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882467.54380: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882467.54486: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
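Before the `AnsiballZ_command.py` execution above, the connection plugin created a per-task remote tmpdir with `( umask 77 && mkdir -p ... )`, i.e. owner-only permissions and a name of the form `ansible-tmp-<timestamp>-<pid>-<random>`. A rough local equivalent — the helper name and layout are illustrative, not Ansible's own code:

```python
import os
import random
import tempfile
import time

def make_task_tmpdir(base):
    # Mimic names like ansible-tmp-1726882467.4302447-16016-207572973032302:
    # <unix timestamp>-<pid>-<random suffix>, created mode 0700
    # (the effect of the shell's `umask 77 && mkdir`).
    name = "ansible-tmp-%s-%d-%d" % (time.time(), os.getpid(),
                                     random.randint(0, 2**48))
    path = os.path.join(base, name)
    os.makedirs(path, mode=0o700)
    return path

base = tempfile.mkdtemp()
path = make_task_tmpdir(base)
print(oct(os.stat(path).st_mode & 0o777))  # 0o700: owner-only access
```

The random suffix keeps concurrent tasks from colliding, and the 0700 mode keeps the staged module payload readable only by the connecting user; the `rm -f -r ... && sleep 0` later in the log removes the directory once the task result is in.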
13531 1726882467.68538: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:4f:68:7a:de:b1 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.11.158/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3396sec preferred_lft 3396sec\n inet6 fe80::104f:68ff:fe7a:deb1/64 scope link \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.158 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.158 metric 100 \nIP -6 ROUTE\n::1 dev lo proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:34:27.674979", "end": "2024-09-20 21:34:27.683399", "delta": "0:00:00.008420", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13531 
1726882467.69873: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 13531 1726882467.69877: stdout chunk (state=3): >>><<< 13531 1726882467.69883: stderr chunk (state=3): >>><<< 13531 1726882467.69903: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:4f:68:7a:de:b1 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.11.158/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3396sec preferred_lft 3396sec\n inet6 fe80::104f:68ff:fe7a:deb1/64 scope link \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.158 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.158 metric 100 \nIP -6 ROUTE\n::1 dev lo proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:34:27.674979", "end": "2024-09-20 21:34:27.683399", "delta": "0:00:00.008420", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n 
ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
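The module emits a single JSON document on stdout (the `{"changed": true, "stdout": ..., "rc": 0, ...}` blob above), and the `echo IP` / `echo IP ROUTE` / `echo RESOLV` markers in the task's script exist so the combined output can be split back into labelled sections. A hedged sketch of consuming such a result — the payload below is abbreviated from the log, and the splitting helper is an illustration, not part of the test playbook:

```python
import json

# Abbreviated stand-in for the AnsiballZ_command.py result seen above.
raw = json.dumps({
    "changed": True,
    "rc": 0,
    "stdout": "IP\n1: lo: mtu 65536 ...\nIP ROUTE\ndefault via 10.31.8.1 ..."
              "\nRESOLV\nnameserver 10.29.169.13",
})

def split_sections(stdout, markers):
    # Group output lines under the most recent marker line.
    sections, current = {}, None
    for line in stdout.splitlines():
        if line in markers:
            current = line
            sections[current] = []
        elif current is not None:
            sections[current].append(line)
    return sections

result = json.loads(raw)
assert result["rc"] == 0
sections = split_sections(result["stdout"],
                          {"IP", "IP ROUTE", "IP -6 ROUTE", "RESOLV"})
print(sorted(sections))  # ['IP', 'IP ROUTE', 'RESOLV']
```

The task script's `set -euo pipefail` also matters here: if any of the `ip` or `cat` commands failed, the whole command would return non-zero and the marker layout could not be trusted.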
13531 1726882467.69949: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882467.4302447-16016-207572973032302/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13531 1726882467.69960: _low_level_execute_command(): starting 13531 1726882467.69967: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882467.4302447-16016-207572973032302/ > /dev/null 2>&1 && sleep 0' 13531 1726882467.70520: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882467.70523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882467.70553: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882467.70556: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882467.70559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882467.70610: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882467.70615: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882467.70730: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882467.72556: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882467.72620: stderr chunk (state=3): >>><<< 13531 1726882467.72623: stdout chunk (state=3): >>><<< 13531 1726882467.72647: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882467.72654: handler run complete 13531 1726882467.72684: Evaluated conditional (False): False 13531 1726882467.72695: attempt loop complete, returning result 13531 1726882467.72698: _execute() done 13531 1726882467.72700: dumping result to json 13531 1726882467.72706: done dumping result, returning 13531 1726882467.72715: done running TaskExecutor() for managed_node2/TASK: Check routes and DNS [0e448fcc-3ce9-4fd9-519d-0000000009f0] 13531 1726882467.72722: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000009f0 13531 1726882467.72839: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000009f0 13531 1726882467.72842: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008420", "end": "2024-09-20 21:34:27.683399", "rc": 0, "start": "2024-09-20 21:34:27.674979" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 12:4f:68:7a:de:b1 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.11.158/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0 valid_lft 3396sec preferred_lft 3396sec inet6 fe80::104f:68ff:fe7a:deb1/64 scope link valid_lft forever preferred_lft forever IP ROUTE default via 10.31.8.1 dev eth0 proto dhcp src 
10.31.11.158 metric 100 10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.158 metric 100 IP -6 ROUTE ::1 dev lo proto kernel metric 256 pref medium fe80::/64 dev eth0 proto kernel metric 256 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 13531 1726882467.73035: no more pending results, returning what we have 13531 1726882467.73038: results queue empty 13531 1726882467.73039: checking for any_errors_fatal 13531 1726882467.73041: done checking for any_errors_fatal 13531 1726882467.73041: checking for max_fail_percentage 13531 1726882467.73043: done checking for max_fail_percentage 13531 1726882467.73044: checking to see if all hosts have failed and the running result is not ok 13531 1726882467.73045: done checking to see if all hosts have failed 13531 1726882467.73046: getting the remaining hosts for this loop 13531 1726882467.73047: done getting the remaining hosts for this loop 13531 1726882467.73050: getting the next task for host managed_node2 13531 1726882467.73059: done getting next task for host managed_node2 13531 1726882467.73062: ^ task is: TASK: Verify DNS and network connectivity 13531 1726882467.73071: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13531 1726882467.73080: getting variables 13531 1726882467.73082: in VariableManager get_vars() 13531 1726882467.73138: Calling all_inventory to load vars for managed_node2 13531 1726882467.73141: Calling groups_inventory to load vars for managed_node2 13531 1726882467.73144: Calling all_plugins_inventory to load vars for managed_node2 13531 1726882467.73158: Calling all_plugins_play to load vars for managed_node2 13531 1726882467.73161: Calling groups_plugins_inventory to load vars for managed_node2 13531 1726882467.73166: Calling groups_plugins_play to load vars for managed_node2 13531 1726882467.74974: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13531 1726882467.76890: done with get_vars() 13531 1726882467.76916: done getting variables 13531 1726882467.76981: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 21:34:27 -0400 (0:00:00.383) 0:00:55.665 ****** 13531 1726882467.77021: entering _queue_task() for managed_node2/shell 13531 1726882467.77376: worker is 1 (out of 1 available) 13531 1726882467.77388: exiting _queue_task() for managed_node2/shell 13531 1726882467.77400: done queuing things up, now waiting for results queue to drain 13531 1726882467.77401: waiting for pending results... 
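[Editor's aside] The recurring `auto-mux: Trying existing master` / `mux_client_request_session` lines in the stderr dumps show that each of these per-task commands attaches to one multiplexed SSH connection instead of re-authenticating. A typical client-side stanza that produces this behavior looks like the sketch below; the actual options in this run come from `ansible_ssh_extra_args` and Ansible's SSH plugin defaults, which are not visible in the log:

```
Host 10.31.11.158
    # ControlMaster multiplexing (illustrative values): the first connection
    # becomes the master; later sessions attach to it, which is what the
    # "mux_client_hello_exchange: master version 4" lines record.
    ControlMaster auto
    ControlPath ~/.ansible/cp/%h-%p-%r
    ControlPersist 60s
```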
13531 1726882467.77718: running TaskExecutor() for managed_node2/TASK: Verify DNS and network connectivity 13531 1726882467.77836: in run() - task 0e448fcc-3ce9-4fd9-519d-0000000009f1 13531 1726882467.77853: variable 'ansible_search_path' from source: unknown 13531 1726882467.77862: variable 'ansible_search_path' from source: unknown 13531 1726882467.77904: calling self._execute() 13531 1726882467.78014: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882467.78019: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882467.78028: variable 'omit' from source: magic vars 13531 1726882467.78436: variable 'ansible_distribution_major_version' from source: facts 13531 1726882467.78448: Evaluated conditional (ansible_distribution_major_version != '6'): True 13531 1726882467.78619: variable 'ansible_facts' from source: unknown 13531 1726882467.79495: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 13531 1726882467.79505: variable 'omit' from source: magic vars 13531 1726882467.79552: variable 'omit' from source: magic vars 13531 1726882467.79586: variable 'omit' from source: magic vars 13531 1726882467.79644: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13531 1726882467.79682: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13531 1726882467.79704: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13531 1726882467.79728: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882467.79738: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13531 1726882467.79769: variable 'inventory_hostname' from source: host vars for 'managed_node2' 13531 1726882467.79773: variable 
'ansible_host' from source: host vars for 'managed_node2' 13531 1726882467.79775: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882467.79886: Set connection var ansible_pipelining to False 13531 1726882467.79892: Set connection var ansible_timeout to 10 13531 1726882467.79897: Set connection var ansible_shell_executable to /bin/sh 13531 1726882467.79903: Set connection var ansible_module_compression to ZIP_DEFLATED 13531 1726882467.79905: Set connection var ansible_connection to ssh 13531 1726882467.79908: Set connection var ansible_shell_type to sh 13531 1726882467.79946: variable 'ansible_shell_executable' from source: unknown 13531 1726882467.79950: variable 'ansible_connection' from source: unknown 13531 1726882467.79952: variable 'ansible_module_compression' from source: unknown 13531 1726882467.79955: variable 'ansible_shell_type' from source: unknown 13531 1726882467.79960: variable 'ansible_shell_executable' from source: unknown 13531 1726882467.79962: variable 'ansible_host' from source: host vars for 'managed_node2' 13531 1726882467.79968: variable 'ansible_pipelining' from source: unknown 13531 1726882467.79970: variable 'ansible_timeout' from source: unknown 13531 1726882467.79976: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 13531 1726882467.80117: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882467.80126: variable 'omit' from source: magic vars 13531 1726882467.80132: starting attempt loop 13531 1726882467.80140: running the handler 13531 1726882467.80161: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13531 1726882467.80181: _low_level_execute_command(): starting 13531 1726882467.80188: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13531 1726882467.81014: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882467.81028: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882467.81050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882467.81069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882467.81108: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882467.81116: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882467.81126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882467.81141: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882467.81163: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882467.81172: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882467.81180: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882467.81190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882467.81202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882467.81209: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882467.81217: stderr chunk (state=3): >>>debug2: match found <<< 13531 
1726882467.81227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882467.81312: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882467.81328: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882467.81337: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882467.81476: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882467.83139: stdout chunk (state=3): >>>/root <<< 13531 1726882467.83246: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882467.83324: stderr chunk (state=3): >>><<< 13531 1726882467.83337: stdout chunk (state=3): >>><<< 13531 1726882467.83370: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received 
exit status from master 0 13531 1726882467.83467: _low_level_execute_command(): starting 13531 1726882467.83472: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882467.8338218-16030-118793857988469 `" && echo ansible-tmp-1726882467.8338218-16030-118793857988469="` echo /root/.ansible/tmp/ansible-tmp-1726882467.8338218-16030-118793857988469 `" ) && sleep 0' 13531 1726882467.84011: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882467.84025: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882467.84040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882467.84060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882467.84104: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882467.84117: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882467.84131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882467.84148: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882467.84158: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882467.84175: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882467.84186: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882467.84198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882467.84213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882467.84224: stderr chunk (state=3): >>>debug2: checking match for 'final 
all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882467.84234: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882467.84246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882467.84320: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882467.84342: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882467.84359: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882467.84494: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882467.86397: stdout chunk (state=3): >>>ansible-tmp-1726882467.8338218-16030-118793857988469=/root/.ansible/tmp/ansible-tmp-1726882467.8338218-16030-118793857988469 <<< 13531 1726882467.86551: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882467.86554: stdout chunk (state=3): >>><<< 13531 1726882467.86567: stderr chunk (state=3): >>><<< 13531 1726882467.86582: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882467.8338218-16030-118793857988469=/root/.ansible/tmp/ansible-tmp-1726882467.8338218-16030-118793857988469 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13531 1726882467.86610: variable 'ansible_module_compression' from source: unknown 13531 1726882467.86666: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13531i_snf1g8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13531 1726882467.86699: variable 'ansible_facts' from source: unknown 13531 1726882467.86783: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882467.8338218-16030-118793857988469/AnsiballZ_command.py 13531 1726882467.86912: Sending initial data 13531 1726882467.86916: Sent initial data (156 bytes) 13531 1726882467.87839: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882467.87847: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882467.87860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882467.87877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882467.87914: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882467.87920: stderr chunk (state=3): >>>debug2: match not found <<< 13531 1726882467.87931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882467.87942: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13531 1726882467.87949: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 13531 1726882467.87956: stderr 
chunk (state=3): >>>debug1: re-parsing configuration <<< 13531 1726882467.87972: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882467.87981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882467.87992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882467.87999: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 13531 1726882467.88006: stderr chunk (state=3): >>>debug2: match found <<< 13531 1726882467.88014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882467.88089: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13531 1726882467.88102: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882467.88113: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882467.88235: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882467.89994: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13531 1726882467.90088: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13531 1726882467.90186: stdout chunk 
(state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13531i_snf1g8/tmp3qgui71v /root/.ansible/tmp/ansible-tmp-1726882467.8338218-16030-118793857988469/AnsiballZ_command.py <<< 13531 1726882467.90280: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13531 1726882467.91575: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882467.91578: stdout chunk (state=3): >>><<< 13531 1726882467.91580: stderr chunk (state=3): >>><<< 13531 1726882467.91582: done transferring module to remote 13531 1726882467.91671: _low_level_execute_command(): starting 13531 1726882467.91678: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882467.8338218-16030-118793857988469/ /root/.ansible/tmp/ansible-tmp-1726882467.8338218-16030-118793857988469/AnsiballZ_command.py && sleep 0' 13531 1726882467.92200: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13531 1726882467.92210: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882467.92219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882467.92251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882467.92294: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882467.92298: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882467.92300: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882467.92348: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882467.92371: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882467.92499: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882467.94322: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882467.94365: stderr chunk (state=3): >>><<< 13531 1726882467.94369: stdout chunk (state=3): >>><<< 13531 1726882467.94381: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
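[Editor's aside] Stripped of the SSH noise, the `_low_level_execute_command()` calls above trace Ansible's standard remote-execution pipeline for a task: create a private tmpdir, SFTP the AnsiballZ module over, `chmod` it, run it with the remote interpreter, then remove the tmpdir. The same sequence can be emulated locally against a scratch directory (a sketch with no SSH involved; the file names are illustrative, not the ones from the log):

```shell
# Local emulation of the per-task pipeline visible in the log (no SSH).
tmp="$(mktemp -d)"

# 1. `umask 77 && mkdir`: a private temp dir, like ansible-tmp-<ts>-<pid>-<rand>
( umask 77 && mkdir "$tmp/ansible-tmp-demo" )

# 2. stand-in for the sftp "put" of AnsiballZ_command.py
printf '%s\n' 'print("ok")' > "$tmp/ansible-tmp-demo/AnsiballZ_command.py"

# 3. `chmod u+x` on the dir and the module, exactly as in the log
chmod u+x "$tmp/ansible-tmp-demo" "$tmp/ansible-tmp-demo/AnsiballZ_command.py"

# 4. execute with the remote-side interpreter (python3 here)
out="$(python3 "$tmp/ansible-tmp-demo/AnsiballZ_command.py")"
echo "$out"

# 5. `rm -f -r` of the tmpdir, as in the final _low_level_execute_command
rm -rf "$tmp"
```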
13531 1726882467.94383: _low_level_execute_command(): starting 13531 1726882467.94388: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882467.8338218-16030-118793857988469/AnsiballZ_command.py && sleep 0' 13531 1726882467.94795: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882467.94800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882467.94831: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882467.94847: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13531 1726882467.94850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882467.94933: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882467.94936: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882467.95053: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882468.38241: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org 
mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 5865 0 --:--:-- --:--:-- --:--:-- 5865\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 1310 0 --:--:-- --:--:-- --:--:-- 1310", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 21:34:28.081426", "end": "2024-09-20 21:34:28.380044", "delta": "0:00:00.298618", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13531 1726882468.39517: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 13531 1726882468.39588: stderr chunk (state=3): >>><<< 13531 1726882468.39592: stdout chunk (state=3): >>><<< 13531 1726882468.39612: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org 
mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 5865 0 --:--:-- --:--:-- --:--:-- 5865\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 1310 0 --:--:-- --:--:-- --:--:-- 1310", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 21:34:28.081426", "end": "2024-09-20 21:34:28.380044", "delta": "0:00:00.298618", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 13531 1726882468.39648: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! 
curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882467.8338218-16030-118793857988469/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13531 1726882468.39655: _low_level_execute_command(): starting 13531 1726882468.39675: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882467.8338218-16030-118793857988469/ > /dev/null 2>&1 && sleep 0' 13531 1726882468.40316: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882468.40319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13531 1726882468.40351: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 13531 1726882468.40388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13531 1726882468.40391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13531 1726882468.40459: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13531 1726882468.40495: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13531 1726882468.40606: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13531 1726882468.42443: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13531 1726882468.42487: stderr chunk (state=3): >>><<< 13531 1726882468.42492: stdout chunk (state=3): >>><<< 13531 1726882468.42504: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received 
exit status from master 0
13531 1726882468.42511: handler run complete
13531 1726882468.42528: Evaluated conditional (False): False
13531 1726882468.42536: attempt loop complete, returning result
13531 1726882468.42539: _execute() done
13531 1726882468.42541: dumping result to json
13531 1726882468.42547: done dumping result, returning
13531 1726882468.42554: done running TaskExecutor() for managed_node2/TASK: Verify DNS and network connectivity [0e448fcc-3ce9-4fd9-519d-0000000009f1]
13531 1726882468.42562: sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000009f1
13531 1726882468.42672: done sending task result for task 0e448fcc-3ce9-4fd9-519d-0000000009f1
13531 1726882468.42674: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "changed": false,
    "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n",
    "delta": "0:00:00.298618",
    "end": "2024-09-20 21:34:28.380044",
    "rc": 0,
    "start": "2024-09-20 21:34:28.081426"
}

STDOUT:

CHECK DNS AND CONNECTIVITY
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org

STDERR:

  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   305  100   305    0     0   5865      0 --:--:-- --:--:-- --:--:--  5865
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   291  100   291    0     0   1310      0 --:--:-- --:--:-- --:--:--  1310

13531 1726882468.42747: no more pending results, returning what we have
13531 1726882468.42750: results queue empty
13531 1726882468.42751: checking for any_errors_fatal
13531 1726882468.42769: done checking for any_errors_fatal
13531 1726882468.42770: checking for max_fail_percentage
13531 1726882468.42772: done checking for max_fail_percentage
13531 1726882468.42773: checking to see if all hosts have failed and the running result is not ok
13531 1726882468.42773: done checking to see if all hosts have failed
13531 1726882468.42774: getting the remaining hosts for this loop
13531 1726882468.42775: done getting the remaining hosts for this loop
13531 1726882468.42779: getting the next task for host managed_node2
13531 1726882468.42788: done getting next task for host managed_node2
13531 1726882468.42791: ^ task is: TASK: meta (flush_handlers)
13531 1726882468.42793: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue?
False, did start at task? False
13531 1726882468.42797: getting variables
13531 1726882468.42798: in VariableManager get_vars()
13531 1726882468.42847: Calling all_inventory to load vars for managed_node2
13531 1726882468.42849: Calling groups_inventory to load vars for managed_node2
13531 1726882468.42851: Calling all_plugins_inventory to load vars for managed_node2
13531 1726882468.42862: Calling all_plugins_play to load vars for managed_node2
13531 1726882468.42872: Calling groups_plugins_inventory to load vars for managed_node2
13531 1726882468.42877: Calling groups_plugins_play to load vars for managed_node2
13531 1726882468.43705: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13531 1726882468.44733: done with get_vars()
13531 1726882468.44748: done getting variables
13531 1726882468.44799: in VariableManager get_vars()
13531 1726882468.44814: Calling all_inventory to load vars for managed_node2
13531 1726882468.44816: Calling groups_inventory to load vars for managed_node2
13531 1726882468.44817: Calling all_plugins_inventory to load vars for managed_node2
13531 1726882468.44820: Calling all_plugins_play to load vars for managed_node2
13531 1726882468.44822: Calling groups_plugins_inventory to load vars for managed_node2
13531 1726882468.44823: Calling groups_plugins_play to load vars for managed_node2
13531 1726882468.45511: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13531 1726882468.46452: done with get_vars()
13531 1726882468.46472: done queuing things up, now waiting for results queue to drain
13531 1726882468.46474: results queue empty
13531 1726882468.46474: checking for any_errors_fatal
13531 1726882468.46477: done checking for any_errors_fatal
13531 1726882468.46477: checking for max_fail_percentage
13531 1726882468.46478: done checking for max_fail_percentage
13531 1726882468.46478: checking to see if all hosts have failed and the running result is not ok
13531 1726882468.46479: done checking to see if all hosts have failed
13531 1726882468.46479: getting the remaining hosts for this loop
13531 1726882468.46480: done getting the remaining hosts for this loop
13531 1726882468.46482: getting the next task for host managed_node2
13531 1726882468.46485: done getting next task for host managed_node2
13531 1726882468.46486: ^ task is: TASK: meta (flush_handlers)
13531 1726882468.46487: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13531 1726882468.46489: getting variables
13531 1726882468.46490: in VariableManager get_vars()
13531 1726882468.46502: Calling all_inventory to load vars for managed_node2
13531 1726882468.46503: Calling groups_inventory to load vars for managed_node2
13531 1726882468.46505: Calling all_plugins_inventory to load vars for managed_node2
13531 1726882468.46508: Calling all_plugins_play to load vars for managed_node2
13531 1726882468.46510: Calling groups_plugins_inventory to load vars for managed_node2
13531 1726882468.46511: Calling groups_plugins_play to load vars for managed_node2
13531 1726882468.47231: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13531 1726882468.48138: done with get_vars()
13531 1726882468.48152: done getting variables
13531 1726882468.48187: in VariableManager get_vars()
13531 1726882468.48199: Calling all_inventory to load vars for managed_node2
13531 1726882468.48201: Calling groups_inventory to load vars for managed_node2
13531 1726882468.48202: Calling all_plugins_inventory to load vars for managed_node2
13531 1726882468.48205: Calling all_plugins_play to load vars for managed_node2
13531 1726882468.48210: Calling groups_plugins_inventory to load vars for managed_node2
13531 1726882468.48212: Calling groups_plugins_play to load vars for managed_node2
13531 1726882468.48882: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13531 1726882468.49810: done with get_vars()
13531 1726882468.49827: done queuing things up, now waiting for results queue to drain
13531 1726882468.49828: results queue empty
13531 1726882468.49829: checking for any_errors_fatal
13531 1726882468.49830: done checking for any_errors_fatal
13531 1726882468.49830: checking for max_fail_percentage
13531 1726882468.49831: done checking for max_fail_percentage
13531 1726882468.49831: checking to see if all hosts have failed and the running result is not ok
13531 1726882468.49832: done checking to see if all hosts have failed
13531 1726882468.49832: getting the remaining hosts for this loop
13531 1726882468.49833: done getting the remaining hosts for this loop
13531 1726882468.49835: getting the next task for host managed_node2
13531 1726882468.49838: done getting next task for host managed_node2
13531 1726882468.49839: ^ task is: None
13531 1726882468.49840: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13531 1726882468.49840: done queuing things up, now waiting for results queue to drain
13531 1726882468.49841: results queue empty
13531 1726882468.49841: checking for any_errors_fatal
13531 1726882468.49842: done checking for any_errors_fatal
13531 1726882468.49842: checking for max_fail_percentage
13531 1726882468.49843: done checking for max_fail_percentage
13531 1726882468.49843: checking to see if all hosts have failed and the running result is not ok
13531 1726882468.49844: done checking to see if all hosts have failed
13531 1726882468.49845: getting the next task for host managed_node2
13531 1726882468.49846: done getting next task for host managed_node2
13531 1726882468.49847: ^ task is: None
13531 1726882468.49847: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False

PLAY RECAP *********************************************************************
managed_node2              : ok=109  changed=5    unreachable=0    failed=0    skipped=120  rescued=0    ignored=0

Friday 20 September 2024  21:34:28 -0400 (0:00:00.729)       0:00:56.394 ******
===============================================================================
Create test interfaces -------------------------------------------------- 1.77s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35
Gathering Facts --------------------------------------------------------- 1.77s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:3
fedora.linux_system_roles.network : Check which services are running ---- 1.76s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.72s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.70s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.67s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.66s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Install dnsmasq --------------------------------------------------------- 1.60s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3
Install pgrep, sysctl --------------------------------------------------- 1.44s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26
Gathering Facts --------------------------------------------------------- 1.43s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_nm.yml:6
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.25s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.10s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.06s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Check which packages are installed --- 1.04s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Check which packages are installed --- 0.98s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.98s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.94s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Check which packages are installed --- 0.91s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.84s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Check which packages are installed --- 0.82s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
13531 1726882468.49967: RUNNING CLEANUP
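A note on reading the "Verify DNS and network connectivity" result above: the module invocation embeds the task's shell payload as a single escaped JSON string, which is hard to read in the raw log. A small stdlib-only sketch (the field values are copied verbatim from the task result above) unescapes the payload and sanity-checks that the reported `delta` is simply `end - start`:

```python
import json
from datetime import datetime

# Fields copied verbatim from the "Verify DNS and network connectivity"
# task result logged above (the full result carries more keys; these
# four are enough for the check).
result = json.loads(r'''
{"rc": 0,
 "start": "2024-09-20 21:34:28.081426",
 "end": "2024-09-20 21:34:28.380044",
 "delta": "0:00:00.298618",
 "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n"}
''')

# Unescaping "cmd" recovers the multi-line getent/curl loop that the
# ansible.legacy.command module actually ran on the managed node.
print(result["cmd"])

# "delta" is the wall-clock difference between "end" and "start".
fmt = "%Y-%m-%d %H:%M:%S.%f"
elapsed = (datetime.strptime(result["end"], fmt)
           - datetime.strptime(result["start"], fmt))
assert str(elapsed) == result["delta"]
```

Printing `result["cmd"]` shows the readable script: a `getent hosts` lookup followed by an HTTPS `curl` for each mirror host, exiting 1 on the first failure, which is why `rc: 0` here means both DNS resolution and connectivity succeeded.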